Full Code of remind101/stacker for AI

Repository: remind101/stacker
Branch: master
Commit: b357f83596e0
Files: 226
Total size: 802.1 KB

Directory structure:
gitextract_put1k2j_/

├── .circleci/
│   └── config.yml
├── .dockerignore
├── .gitignore
├── AUTHORS.rst
├── CHANGELOG.md
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── Dockerfile
├── LICENSE
├── Makefile
├── README.rst
├── RELEASE.md
├── codecov.yml
├── conf/
│   └── README.rst
├── docs/
│   ├── .gitignore
│   ├── Makefile
│   ├── api/
│   │   ├── modules.rst
│   │   ├── stacker.actions.rst
│   │   ├── stacker.blueprints.rst
│   │   ├── stacker.blueprints.variables.rst
│   │   ├── stacker.commands.rst
│   │   ├── stacker.commands.stacker.rst
│   │   ├── stacker.config.rst
│   │   ├── stacker.config.translators.rst
│   │   ├── stacker.hooks.rst
│   │   ├── stacker.logger.rst
│   │   ├── stacker.lookups.handlers.rst
│   │   ├── stacker.lookups.rst
│   │   ├── stacker.providers.aws.rst
│   │   ├── stacker.providers.rst
│   │   └── stacker.rst
│   ├── blueprints.rst
│   ├── commands.rst
│   ├── conf.py
│   ├── config.rst
│   ├── environments.rst
│   ├── index.rst
│   ├── lookups.rst
│   ├── organizations_using_stacker.rst
│   ├── templates.rst
│   ├── terminology.rst
│   └── translators.rst
├── examples/
│   └── cross-account/
│       ├── .aws/
│       │   └── config
│       ├── README.md
│       ├── stacker.yaml
│       └── templates/
│           ├── stacker-bucket.yaml
│           └── stacker-role.yaml
├── requirements.in
├── scripts/
│   ├── compare_env
│   ├── docker-stacker
│   ├── stacker
│   └── stacker.cmd
├── setup.cfg
├── setup.py
├── stacker/
│   ├── __init__.py
│   ├── actions/
│   │   ├── __init__.py
│   │   ├── base.py
│   │   ├── build.py
│   │   ├── destroy.py
│   │   ├── diff.py
│   │   ├── graph.py
│   │   └── info.py
│   ├── awscli_yamlhelper.py
│   ├── blueprints/
│   │   ├── __init__.py
│   │   ├── base.py
│   │   ├── raw.py
│   │   ├── testutil.py
│   │   └── variables/
│   │       ├── __init__.py
│   │       └── types.py
│   ├── commands/
│   │   ├── __init__.py
│   │   └── stacker/
│   │       ├── __init__.py
│   │       ├── base.py
│   │       ├── build.py
│   │       ├── destroy.py
│   │       ├── diff.py
│   │       ├── graph.py
│   │       └── info.py
│   ├── config/
│   │   ├── __init__.py
│   │   └── translators/
│   │       ├── __init__.py
│   │       └── kms.py
│   ├── context.py
│   ├── dag/
│   │   └── __init__.py
│   ├── environment.py
│   ├── exceptions.py
│   ├── hooks/
│   │   ├── __init__.py
│   │   ├── aws_lambda.py
│   │   ├── command.py
│   │   ├── ecs.py
│   │   ├── iam.py
│   │   ├── keypair.py
│   │   ├── route53.py
│   │   └── utils.py
│   ├── logger/
│   │   └── __init__.py
│   ├── lookups/
│   │   ├── __init__.py
│   │   ├── handlers/
│   │   │   ├── __init__.py
│   │   │   ├── ami.py
│   │   │   ├── default.py
│   │   │   ├── dynamodb.py
│   │   │   ├── envvar.py
│   │   │   ├── file.py
│   │   │   ├── hook_data.py
│   │   │   ├── kms.py
│   │   │   ├── output.py
│   │   │   ├── rxref.py
│   │   │   ├── split.py
│   │   │   ├── ssmstore.py
│   │   │   └── xref.py
│   │   └── registry.py
│   ├── plan.py
│   ├── providers/
│   │   ├── __init__.py
│   │   ├── aws/
│   │   │   ├── __init__.py
│   │   │   └── default.py
│   │   └── base.py
│   ├── session_cache.py
│   ├── stack.py
│   ├── status.py
│   ├── target.py
│   ├── tests/
│   │   ├── __init__.py
│   │   ├── actions/
│   │   │   ├── __init__.py
│   │   │   ├── test_base.py
│   │   │   ├── test_build.py
│   │   │   ├── test_destroy.py
│   │   │   └── test_diff.py
│   │   ├── blueprints/
│   │   │   ├── __init__.py
│   │   │   ├── test_base.py
│   │   │   ├── test_raw.py
│   │   │   └── test_testutil.py
│   │   ├── conftest.py
│   │   ├── factories.py
│   │   ├── fixtures/
│   │   │   ├── __init__.py
│   │   │   ├── basic.env
│   │   │   ├── cfn_template.json
│   │   │   ├── cfn_template.json.j2
│   │   │   ├── cfn_template.yaml
│   │   │   ├── keypair/
│   │   │   │   ├── fingerprint
│   │   │   │   ├── id_rsa
│   │   │   │   └── id_rsa.pub
│   │   │   ├── mock_blueprints.py
│   │   │   ├── mock_hooks.py
│   │   │   ├── mock_lookups.py
│   │   │   ├── not-basic.env
│   │   │   ├── parameter_resolution/
│   │   │   │   └── template.yml
│   │   │   ├── vpc-bastion-db-web-pre-1.0.yaml
│   │   │   ├── vpc-bastion-db-web.yaml
│   │   │   └── vpc-custom-log-format-info.yaml
│   │   ├── hooks/
│   │   │   ├── __init__.py
│   │   │   ├── test_aws_lambda.py
│   │   │   ├── test_command.py
│   │   │   ├── test_ecs.py
│   │   │   ├── test_iam.py
│   │   │   └── test_keypair.py
│   │   ├── lookups/
│   │   │   ├── __init__.py
│   │   │   ├── handlers/
│   │   │   │   ├── __init__.py
│   │   │   │   ├── test_ami.py
│   │   │   │   ├── test_default.py
│   │   │   │   ├── test_dynamodb.py
│   │   │   │   ├── test_envvar.py
│   │   │   │   ├── test_file.py
│   │   │   │   ├── test_hook_data.py
│   │   │   │   ├── test_output.py
│   │   │   │   ├── test_rxref.py
│   │   │   │   ├── test_split.py
│   │   │   │   ├── test_ssmstore.py
│   │   │   │   └── test_xref.py
│   │   │   └── test_registry.py
│   │   ├── providers/
│   │   │   ├── __init__.py
│   │   │   └── aws/
│   │   │       ├── __init__.py
│   │   │       └── test_default.py
│   │   ├── test_config.py
│   │   ├── test_context.py
│   │   ├── test_dag.py
│   │   ├── test_environment.py
│   │   ├── test_lookups.py
│   │   ├── test_parse_user_data.py
│   │   ├── test_plan.py
│   │   ├── test_stack.py
│   │   ├── test_stacker.py
│   │   ├── test_util.py
│   │   └── test_variables.py
│   ├── tokenize_userdata.py
│   ├── ui.py
│   ├── util.py
│   └── variables.py
├── test-requirements.in
└── tests/
    ├── Makefile
    ├── README.md
    ├── cleanup_functional_test_buckets.sh
    ├── fixtures/
    │   ├── blueprints/
    │   │   └── test_repo.json
    │   └── stack_policies/
    │       ├── default.json
    │       └── none.json
    ├── run_test_suite.sh
    ├── stacker.yaml.sh
    ├── test_helper.bash
    └── test_suite/
        ├── 01_stacker_build_no_config.bats
        ├── 02_stacker_build_empty_config.bats
        ├── 03_stacker_build-config_with_no_stacks.bats
        ├── 04_stacker_build-config_with_no_namespace.bats
        ├── 05_stacker_build-missing_environment_key.bats
        ├── 06_stacker_build-duplicate_stacks.bats
        ├── 07_stacker_graph-json_format.bats
        ├── 08_stacker_graph-dot_format.bats
        ├── 09_stacker_build-missing_variable.bats
        ├── 10_stacker_build-simple_build.bats
        ├── 11_stacker_info-simple_info.bats
        ├── 12_stacker_build-simple_build_with_output_lookups.bats
        ├── 13_stacker_build-simple_build_with_environment.bats
        ├── 14_stacker_build-interactive_with_skipped_update.bats
        ├── 15_stacker_build-no_namespace.bats
        ├── 16_stacker_build-overriden_environment_key_with_-e.bats
        ├── 17_stacker_build-dump.bats
        ├── 18_stacker_diff-simple_diff_with_output_lookups.bats
        ├── 19_stacker_build-replacements-only_test_with_additional_resource_no_keyerror.bats
        ├── 20_stacker_build-locked_stacks.bats
        ├── 21_stacker_build-default_mode_without_&_with_protected_stack.bats
        ├── 22_stacker_build-recreate_failed_stack_non-interactive_mode.bats
        ├── 23_stacker_build-recreate_failed_stack_interactive_mode.bats
        ├── 24_stacker_build-handle_rollbacks_during_updates.bats
        ├── 25_stacker_build-handle_rollbacks_in_dependent_stacks.bats
        ├── 26_stacker_build-raw_template.bats
        ├── 27_stacker_diff-raw_template.bats
        ├── 28_stacker_build-raw_template_parameter_resolution.bats
        ├── 29_stacker_build-no_parallelism.bats
        ├── 30_stacker_build-tailing.bats
        ├── 31_stacker_build-override_stack_name.bats
        ├── 32_stacker_build-multi_region.bats
        └── 33_stacker_build-profiles.bats

================================================
FILE CONTENTS
================================================

================================================
FILE: .circleci/config.yml
================================================
version: 2

workflows:
  version: 2
  test-all:
    jobs:
      - lint
      - unit-test-37:
          requires:
            - lint
      - functional-test-37:
          requires:
            - unit-test-37
      - unit-test-38:
          requires:
            - lint
      - functional-test-38:
          requires:
            - unit-test-38
            - functional-test-37
      - unit-test-39:
          requires:
            - lint
      - functional-test-39:
          requires:
            - unit-test-39
            - functional-test-38
      - unit-test-310:
          requires:
            - lint
      - functional-test-310:
          requires:
            - unit-test-310
            - functional-test-39
      - cleanup-functional-buckets:
          requires:
            - functional-test-37
            - functional-test-38
            - functional-test-39
            - functional-test-310

jobs:
  lint:
    docker:
      - image: circleci/python:3.7
    steps:
      - checkout
      - run: sudo pip install flake8 codecov pep8-naming
      - run: sudo python setup.py install
      - run: flake8 --version
      - run: sudo make lint

  unit-test-37:
    docker:
      - image: circleci/python:3.7
    steps: &unit_test_steps
      - checkout
      - run: sudo python setup.py install
      - run: sudo make test-unit

  unit-test-38:
    docker:
      - image: circleci/python:3.8
    steps: *unit_test_steps

  unit-test-39:
    docker:
      - image: circleci/python:3.9
    steps: *unit_test_steps

  unit-test-310:
    docker:
      - image: circleci/python:3.10
    steps: *unit_test_steps

  functional-test-37:
    docker:
      - image: circleci/python:3.7
    steps: &functional_test_steps
      - checkout
      - run:
          command: |
            git clone https://github.com/bats-core/bats-core.git
            cd bats-core
            git checkout v1.0.2
            sudo ./install.sh /usr/local
            bats --version
      - run: sudo python setup.py install
      - run:
          command: |
            export TERM=xterm
            export AWS_DEFAULT_REGION=us-east-1
            export STACKER_NAMESPACE=cloudtools-functional-tests-$CIRCLE_BUILD_NUM
            export STACKER_ROLE=arn:aws:iam::459170252436:role/cloudtools-functional-tests-sta-FunctionalTestRole-1M9HFJ9VQVMFX
            sudo -E make test-functional

  functional-test-38:
    docker:
      - image: circleci/python:3.8
    steps: *functional_test_steps

  functional-test-39:
    docker:
      - image: circleci/python:3.9
    steps: *functional_test_steps

  functional-test-310:
    docker:
      - image: circleci/python:3.10
    steps: *functional_test_steps

  cleanup-functional-buckets:
    docker:
      - image: circleci/python:3.7
    steps:
      - checkout
      - run:
          command: |
            tests/cleanup_functional_test_buckets.sh


================================================
FILE: .dockerignore
================================================
Dockerfile


================================================
FILE: .gitignore
================================================
# Compiled source #
###################
*.com
*.class
*.dll
*.exe
*.o
*.so

# Packages #
############
# it's better to unpack these files and commit the raw source
# git has its own built in compression methods
*.7z
*.dmg
*.gz
*.iso
*.jar
*.rar
*.tar
*.zip

# Logs and databases #
######################
*.log
*.sql
*.sqlite

# OS generated files #
######################
.DS_Store*
ehthumbs.db
Icon?
Thumbs.db

# Vagrant
.vagrant
Vagrantfile

# Editor crap
*.sw*
*~
.idea
*.iml

# Byte-compiled python
*.pyc

# Package directory
build/

# Build object file directory
objdir/
dist/
*.egg-info
.eggs/
*.egg

# Coverage artifacts
.coverage
htmlcov

# Ignore development conf/env files
dev.yaml
dev.env
tests/fixtures/blueprints/*-result
FakeKey.pem
vm_setup.sh


================================================
FILE: AUTHORS.rst
================================================
Authors
=======

Stacker was designed and developed by the OpsEng team at `Remind, Inc.`_

Current Maintainers
-------------------

- `Michael Barrett`_
- `Eric Holmes`_
- `Ignacio Nin`_
- `Russell Ballestrini`_

Alumni
------

- `Michael Hahn`_
- `Tom Taubkin`_

Thanks
------

Stacker wouldn't be where it is today without the open source community that
has formed around it. Thank you to everyone who has contributed, and special
thanks to the following folks who have contributed great features and bug
reports, as well as given guidance in stacker's development:

- `Adam McElwee`_
- `Daniel Miranda`_
- `Troy Ready`_
- `Garison Draper`_
- `Mariusz`_
- `Tolga Tarhan`_

.. _`Remind, Inc.`: https://www.remind.com/

.. _`Michael Barrett`: https://github.com/phobologic
.. _`Eric Holmes`: https://github.com/ejholmes
.. _`Ignacio Nin`: https://github.com/Lowercases
.. _`Russell Ballestrini`: https://github.com/russellballestrini

.. _`Michael Hahn`: https://github.com/mhahn
.. _`Tom Taubkin`: https://github.com/ttaub

.. _`Adam McElwee`: https://github.com/acmcelwee
.. _`Daniel Miranda`: https://github.com/danielkza
.. _`Troy Ready`: https://github.com/troyready
.. _`Garison Draper`: https://github.com/GarisonLotus
.. _`Mariusz`: https://github.com/discobean
.. _`Tolga Tarhan`: https://github.com/ttarhan


================================================
FILE: CHANGELOG.md
================================================
## Upcoming release

## 1.7.2 (2020-11-09)
- address breaking moto change to awslambda [GH-763]
- Added Python version validation before update kms decrypt output [GH-765]

## 1.7.1 (2020-08-17)
- Fix AMI lookup KeyError on 'Name'
- hooks: lambda: allow uploading pre-built payloads [GH-564]
- Ensure that base64 lookup codec encodes the bytes object as a string [GH-742]
- Use CloudFormation Change Sets for `stacker diff`
- Locked stacks still have requirements [GH-746]
- change diff to use CFN change sets instead of comparing template dicts [GH-744]
- Add YAML environment file support [GH-740]
- fix `stack.set_outputs` not being called by diff if stack did not change [GH-754]
- Fix python 2.7/3.5 dependency issue
- add cf notification arns [GH-756]

## 1.7.0 (2019-04-07)

- Additional ECS unit tests [GH-696]
- Keypair unit tests [GH-700]
- Jinja2 templates in plain cloudformation templates [GH-701]
- Custom log output formats [GH-705]
- Python 3.7 unit tests in CircleCI [GH-711]
- Upload blueprint templates with bucket-owner-full-control ACL [GH-713]
- Change test runner from nose to py.test [GH-714]
- support for importing a local public key file with the keypair hook [GH-715]
- support for storing private keys in SSM parameter store with the keypair hook [GH-715]

## 1.6.0 (2019-01-21)

- New lookup format/syntax, making it more generic [GH-665]
- Allow lowercase y/Y when prompted [GH-674]
- Local package sources [GH-677]
- Add `in_progress` option to stack config [GH-678]
- Use default ACL for uploaded lambda code [GH-682]
- Display rollback reason after error [GH-687]
- ssm parameter types [GH-692]

## 1.5.0 (2018-10-14)

The big feature in this release is the introduction of "targets", which act as
a sort of "virtual node" in the graph. They provide a nice way to logically
group stacks.
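
As an illustrative sketch of the idea (the exact keys here are an assumption,
not taken verbatim from stacker's docs), a target can be declared in the config
and used purely as a grouping node that other stacks depend on:

```yaml
# Hypothetical stacker.yaml fragment: "databases" is a virtual node.
# Building it builds both DB stacks, and "app" waits on the whole group.
namespace: example

targets:
  - name: databases

stacks:
  - name: users-db
    class_path: blueprints.rds.RDS
    required_by:
      - databases
  - name: events-db
    class_path: blueprints.rds.RDS
    required_by:
      - databases
  - name: app
    class_path: blueprints.app.App
    requires:
      - databases
```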

- Add support for "targets" [GH-572]
- Fix non-interactive changeset updates w/ stack policies [GH-657]
- Fix interactive_update_stack calls with empty string parameters [GH-658]
- Fix KMS unicode lookup in python 2 [GH-659]
- Locked stacks have no dependencies [GH-661]
- Set default profile earlier [GH-662]
- Get rid of recursion for tail retries and extend retry/timeout [GH-663]

## 1.4.1 (2018-08-28)

This is a minor bugfix release for 1.4.0, no major feature updates.

As of this release python 3.5+ support is no longer considered experimental, and should be stable.

Special thanks to @troyready for this release; I think most of these PRs were his :)

- allow raw cfn templates to be loaded from remote package\_sources [GH-638]
- Add missing config keys to s3 package source model [GH-642]
- Account for UsePreviousValue parameters in diff [GH-644]
- fix file lookup documented and actual return types [GH-646]
- Creates a memoized provider builder for AWS [GH-648]
- update git ref to explicitly return string (fix py3 bytes error) [GH-649]
- Lock botocore/boto to versions that work with moto [GH-651]

## 1.4.0 (2018-08-05)

- YAML & JSON codecs for `file` lookup [GH-537]
- Arbitrary `command` hook [GH-565]
- Fix datetime is not JSON serializable error [GH-591]
- Run dump and outline actions offline [GH-594]
- Helper Makefile for functional tests [GH-597]
- Python3 support!!! [GH-600]
- YAML blueprint testing framework [GH-606]
- new `add_output` helper on Blueprint [GH-611]
- Include lookup contents when lookups fail [GH-614]
- Fix issue with using previous value for parameters [GH-615]
- Stricter config parsing - only allow unrecognized config variables at the top-level [GH-623]
- Documentation for the `default` lookup [GH-636]
- Allow configs without stacks [GH-640]

## 1.3.0 (2018-05-03)

- Support for provisioning stacks in multiple accounts and regions has been added [GH-553], [GH-551]
- Added a `--profile` flag, which can be used to set the global default profile that stacker will use (similar to `AWS_PROFILE`) [GH-563]
- `class_path`/`template_path` are no longer required when a stack is `locked` [GH-557]
- Support for setting stack policies on stacks has been added [GH-570]

## 1.2.0 (2018-03-01)

The biggest change in this release has to do with how we build the graph
of dependencies between stacks. This is now a true DAG. Additionally, to
speed up performance, we now walk the graph in threaded mode, allowing
true parallelism and speeding up "wide" stack graphs considerably.
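
stacker's real implementation lives in its `dag` module; the following is only
a toy sketch of the threaded-walk idea (names and structure here are made up):
each node gets a worker thread that blocks until all of its dependencies have
finished, so independent branches of a "wide" graph run in parallel.

```python
import threading

def walk_parallel(graph, visit):
    """graph maps node -> set of dependency nodes; visit(node) does the work."""
    done = {node: threading.Event() for node in graph}

    def worker(node):
        for dep in graph[node]:
            done[dep].wait()   # block until every dependency has finished
        visit(node)
        done[node].set()       # unblock anything that depends on this node

    threads = [threading.Thread(target=worker, args=(n,)) for n in graph]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

# Toy dependency graph: web depends on vpc and db; db depends on vpc.
order = []
walk_parallel({"vpc": set(), "db": {"vpc"}, "web": {"vpc", "db"}}, order.append)
# "vpc" always completes before "db", and "db" before "web"
```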

- assertRenderedBlueprint always dumps current results [GH-528]
- The `--stacks` flag now automatically builds dependencies of the given stack [GH-523]
- an unnecessary DescribeStacks network call was removed [GH-529]
- support stack json/yaml templates [GH-530]
- `stacker {build,destroy}` now executes stacks in parallel. Parallelism can be controlled with a `-j` flag. [GH-531]
- logging output has been simplified and no longer uses ANSI escape sequences to clear the screen [GH-532]
- logging output is now colorized in `--interactive` mode if the terminal has a TTY [GH-532]
- removed the upper bound on the boto3 dependency [GH-542]

## 1.2.0rc2 (2018-02-27)

- Fix parameter handling for diffs [GH-540]
- Fix an issue where SIGTERM/SIGINT weren't handled immediately [GH-543]
- Log a line when SIGINT/SIGTERM are handled [GH-543]
- Log failed steps at the end of plan execution [GH-543]
- Remove upper bound on boto3 dependency [GH-542]

## 1.2.0rc1 (2018-02-15)

The biggest change in this release has to do with how we build the graph
of dependencies between stacks. This is now a true DAG. Additionally, to
speed up performance, we now walk the graph in threaded mode, allowing
true parallelism and speeding up "wide" stack graphs considerably.

- assertRenderedBlueprint always dumps current results [GH-528]
- stacker now builds a DAG internally [GH-523]
- The `--stacks` flag now automatically builds dependencies of the given stack [GH-523]
- an unnecessary DescribeStacks network call was removed [GH-529]
- support stack json/yaml templates [GH-530]
- `stacker {build,destroy}` now executes stacks in parallel. Parallelism can be controlled with a `-j` flag. [GH-531]
- logging output has been simplified and no longer uses ANSI escape sequences to clear the screen [GH-532]
- logging output is now colorized in `--interactive` mode if the terminal has a TTY [GH-532]


## 1.1.4 (2018-01-26)

- Add `blueprint.to_json` for standalone rendering [GH-459]
- Add global config for troposphere template indent [GH-505]
- Add serverless transform/CREATE changeset types [GH-517]

## 1.1.3 (2017-12-23)

Bugfix release - primarily to deal with a bug that's been around since the
introduction of interactive mode/changesets: we weren't deleting changesets
that were never submitted. This didn't affect anyone for the longest time, but
recently people have started to hit limits on the number of changesets in an
account. The current thinking is that these limits weren't enforced before and
have only recently started being enforced.

- Add S3 remote package sources [GH-487]
- Make blueprint dump always create intermediate directories [GH-499]
- Allow duplicate keys for most config mappings except `stacks` [GH-507]
- Remove un-submitted changesets [GH-513]

## 1.1.2 (2017-11-01)

This is a minor update to help deal with some of the issues caused by `stacker`
and `stacker_blueprints` both depending on `troposphere`. It loosens
the dependencies, allowing stacker to work with any reasonably new version
of troposphere (anything greater than `1.9.0`). `stacker_blueprints` will
likely require newer versions of troposphere as new types are introduced to
the blueprints, but it's unlikely we'll change the `troposphere` version string
for stacker, since it relies on only the most basic parts of the `troposphere`
API.

## 1.1.1 (2017-10-11)

This release is mostly about updating the dependencies for stacker to newer
versions, since that was missed in the last release.

## 1.1.0 (2017-10-08)

- `--max-zones` removed from CLI [GH-427]
- Ami lookup: add region specification [GH-433]
- DynamoDB Lookup [GH-434]
- Environment file is optional now [GH-436]
- New functional test suite [GH-439]
- Structure config object using Schematics [GH-443]
- S3 endpoint fallback [GH-445]
- Stack specific tags [GH-450]
- Allow disabling of stacker bucket (direct CF updates) [GH-451]
- Uniform deprecation warnings [GH-452]
- Remote configuration support [GH-458]
- TroposphereType updates [GH-462]
- Fix replacements-only issue [GH-464]
- testutil enhancements to blueprint testing [GH-467]
- Removal of Interactive Provider (now combined w/ default provider) [GH-469]
- protected stacks [GH-472]
- MUCH Better handling of stack rollbacks & recreations [GH-473]
- follow\_symlinks argument for aws lambda hook [GH-474]
- Enable service\_role for cloudformation operations [GH-476]
- Allow setting stack description from config [GH-477]
- Move S3 templates into sub-directories [GH-478]

## 1.0.4 (2017-07-07)

- Fix issue w/ tail being required (but not existing) on diff/info/etc [GH-429]

## 1.0.3 (2017-07-06)

There was some reworking of how regions are handled, specifically around
s3 and where the buckets for both stacker and the awslambda hook are created.
The stacker bucket now defaults to being created in the region where the
stacks are being created (i.e. the one from the `--region` argument). If you
want the bucket to be in a different region, you can now set the
`stacker_bucket_region` top level config value.

For the awslambda hook, you also have the option of passing `bucket_region` as
an argument, provided you are using a custom `bucket` for the hook. If you
are not using a custom bucket, the logic described above applies.
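
A hedged sketch of the two options described above (the top-level key and hook
args are the ones named in this entry; the bucket name, function name, and
exact hook-list layout are placeholders/assumptions):

```yaml
# Pin the stacker bucket itself to a specific region, independent of --region:
namespace: example
stacker_bucket_region: us-west-2

# For the awslambda hook, bucket_region only applies with a custom bucket:
pre_build:
  - path: stacker.hooks.aws_lambda.upload_lambda_functions
    args:
      bucket: my-lambda-artifacts    # custom bucket (placeholder name)
      bucket_region: us-west-2
      functions:
        my_function:
          path: ./lambda_src
```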

- add ami lookup [GH-360]
- Add support for Property objects in TroposphereType variables [GH-379]
- Add debugging statements to sys.path appending [GH-385]
- Catch undefined variable value [GH-388]
- Exponential backoff waiting for AWS changeset to stabilize [GH-389]
- Add parameter changes to diff output [GH-394]
- Add CODE\_OF\_CONDUCT.md [GH-399]
- Add a hint for forbidden bucket access [GH-401]
- Fix issues w/ "none" as variable values [GH-405]
- Remove extra '/' in blueprint tests [GH-409]
- Fix dump provider interaction with lookups [GH-410]
- Add ssmstore lookup docs [GH-411]
- Fix issue w/ s3 buckets in different regions [GH-413, GH-417]
- Disable loop logger when --tail is provided [GH-414]
- Add envvar lookup [GH-418]

## 1.0.2 (2017-05-10)

- fix lambda hook determinism [GH-372]
- give lambda hook ability to upload to a prefix [GH-376]
- fix bad argument for approval in interactive provider [GH-381]

## 1.0.1 (2017-04-24)

- rxref lookup [GH-328]
- Cleaned up raise statement in blueprints [GH-348]
- Fix missing default provider for build\_parameters [GH-353]
- Setup codecov [GH-354]
- Added blueprint testing harness [GH-362]
- context hook\_data lookup [GH-366]

## 1.0.0 (2017-03-04)

This is a major release with the main change being the removal of the old
Parameters logic in favor of Blueprint Variables and Lookups.
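
To illustrate the Lookups half of that change (a minimal sketch; the stack and
output names here are made up), a config can wire one stack's output into
another stack's variable via the `output` lookup instead of the old Parameters
plumbing:

```yaml
# Sketch: VpcId is resolved at build time from the vpc stack's VpcId output.
stacks:
  - name: app
    class_path: blueprints.app.App
    variables:
      VpcId: ${output vpc::VpcId}
```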

- Add support for resolving variables when calling `dump`[GH-231]
- Remove old Parameters code [GH-232]
- Pass Context & Provider to hooks [GH-233]
- Fix Issue w/ Dump [GH-241]
- Support `allowed_values` within variable definitions [GH-245]
- Fix filehandler lookups with pseudo parameters [GH-247]
- keypair hook update to match route53 update [GH-248]
- Add support for `TroposphereType` [GH-249]
- Allow = in lookup contents [GH-251]
- Add troposphere types [GH-257]
- change capabilities to CAPABILITY\_NAMED\_IAM [GH-262]
- Disable transformation of variables [GH-266]
- Support destroying a subset of stacks [GH-278]
- Update all hooks to use advanced results [GH-285]
- Use sys\_path for hooks and lookups [GH-286]
- Remove last of botocore connections [GH-287]
- Remove --var flag [GH-289]
- Avoid dictionary sharing pollution [GH-293]
- Change aws\_lambda hook handler to use proper parameters [GH-297]
- New `split` lookup handler [GH-302]
- add parse\_user\_data [GH-306]
- Add credential caching [GH-307]
- Require explicit call to `output` lookup [GH-310]
- Convert booleans to strings for CFNTypes [GH-311]
- Add ssmstore as a lookup type [GH-314]
- Added region to the ssm store test client [GH-316]
- Add default lookup [GH-317]
- Clean up errors from variables [GH-319]

## 0.8.6 (2017-01-26)

- Support destroying subset of stacks [GH-278]
- Update all hooks to use advanced results [GH-285]
- Use sys\_path for hooks and lookups [GH-286]
- Remove last of botocore conns [GH-287]
- Avoid dictionary sharing pollution [GH-293]

## 0.8.5 (2016-11-28)

- Allow `=` in lookup input [GH-251]
- Add hook for uploading AWS Lambda functions [GH-252]
- Upgrade hard coded capabilities to include named IAM [GH-262]
- Allow hooks to return results that can be looked up later [GH-270]

## 0.8.4 (2016-11-01)

- Fix an issue w/ boto3 version string not working with older setuptools

## 0.8.3 (2016-10-31)

- pass context to hooks as a kwarg [GH-234]
- Fix file handler lookups w/ pseudo parameters [GH-239]
- Allow use of later boto3 [GH-253]

## 0.8.1 (2016-09-22)

Minor update to remove dependencies on stacker\_blueprints for tests, since it
resulted in a circular dependency.  This is just a fix to get tests running again,
and results in no change in functionality.

## 0.8.0 (2016-09-22)

This is a big release which introduces the new concepts of Blueprint Variables
and Lookups. A lot of folks contributed to this release, both in code and in
testing of the new features. Thanks to:

@kylev, @oliviervg1, @datadotworld, @acmcelwee, @troyready, @danielkza, and @ttarhan

Special thanks to @mhahn who did the bulk of the heavy lifting in this release, and
the work towards 1.0!

- Add docs on config, environments & translators [GH-157]
- locked output changed to debug [GH-159]
- Multi-output parameter doc [GH-160]
- Remove spaces from multi-item parameters [GH-161]
- Remove blueprints & configs in favor of stacker\_blueprints [GH-163]
- Clean up plan/status split [GH-165]
- Allow s3 server side encryption [GH-167]
- Support configurable namespace delimiter [GH-169]
- Support tags as a new top-level keyword [GH-171]
- Update to boto3 [GH-174]
- Interactive AWS Provider [GH-178]
- Add config option for appending to sys.path [GH-179]
- More condensed output [GH-182]
- File loading lookup [GH-185]
- Handle stacks without parameters [GH-193]
- Implement blueprint variables & lookups [GH-194]
- Fix traceback on interactive provider when adding resources [GH-198]
- kms lookup [GH-200]
- Compatible release version dependencies [GH-201]
- add xref lookup [GH-202]
- Update docstrings for consistency [GH-204]
- Add support for CFN Parameter types in Blueprint Variables [GH-206]
- Deal w/ multiprocessing library sharing ssl connections [GH-208]
- Fix issues with slashes inside variable lookups [GH-213]
- Custom validators for blueprint variables [GH-218]

## 0.6.3 (2016-05-24)
- add `stacker dump` subcommand for testing stack/blueprints [GH-156]

## 0.6.2 (2016-05-17)
- Allow users to override name of bucket to store templates [GH-145]
- Add support for passing environment variables on the cli via --env [GH-148]
- Cleanup output on non-verbose runs [GH-153]
- Added `compare_env` command, for easier comparing of environment files [GH-155]

## 0.6.1 (2016-02-11)
- Add support for the 'stacker diff' command [GH-133]
- Python boolean parameters automatically converted to strings for CloudFormation [GH-136]
- No longer require mappings in config [GH-140]
- Skipped steps now include a reason [GH-141]

## 0.6.0 (2016-01-07)

- Support tailing cloudformation event stream when building/destroying stacks [GH-90]
- More customizable ASG userdata & options [GH-100]
- Deprecate 'blueprints' in favor of 'stacker\_blueprints' package [GH-125]
- Add KMS based encryption translator [GH-126]
- Fix typo in ASG customization [GH-127]
- Allow file:// prefix with KMS encryption translator [GH-128]
- No longer require a confirmation if the user passes the `--force` flag when destroying [GH-131]

## 0.5.4 (2015-12-03)

- Fix memory leak issue (GH-111) [GH-114]
- Add enabled flag to stacks [GH-115]
- Add support for List<AWS::EC2::*> parameters [GH-117]
- Add eu-west-1 support for empire [GH-116]
- Move get\_fqn to a function, add tests [GH-119]
- Add new postgres versions (9.4.4, 9.4.5) [GH-121]
- Handle blank parameter values [GH-120]

## 0.5.3 (2015-11-03)

- Add --version [GH-91]
- Simplify environment file to key: value, rather than YAML [GH-94]
- Ensure certificate exists hook [GH-94]
- Ensure keypair exists hook [GH-99]
- Custom field constructors & vault encryption [GH-95]
- DBSnapshotIdentifier to RDS blueprints [GH-105]
- Empire ECS Agent telemetry support fixes, use new Empire AMI [GH-107]
- Remove stack tags [GH-110]

## 0.5.2 (2015-09-10)

- Add Dockerfile/image [GH-87]
- Clean up environment docs [GH-88]
- Make StorageType configurable in RDS v2 [GH-92]

## 0.5.1 (2015-09-08)

- Add info subcommand [GH-73]
- Move namespace into environment [GH-72]
- Simplified basecommand [GH-74]
- Documentation updates [GH-75, GH-77, GH-78]
- aws\_helper removal [GH-79]
- Move VPC to use LOCAL\_PARAMETERS [GH-81]
- Lower default AZ count to 2 [GH-82]
- Allow use of all parameter properties [GH-83]
- Parameter gathering in method [GH-84]
- NoEcho on sensitive parameters in blueprints [GH-85]
- Version 2 RDS Blueprints [GH-86]

## 0.5.0 (2015-08-13)

- stacker subcommands [GH-35]
- Added Empire production stacks [GH-43]
  - Major change in internal code layout & added testing
- added destroy subcommand [GH-59]
- Local Blueprint Parameters [GH-61]
- Lockable stacks [GH-62]
- Deal with Cloudformation API throttling [GH-64]
- Clarify Remind's usage of stacker in README [GH-70]

## 0.4.1 (2015-07-23)

- Stack Specific Parameters [GH-32]
- Random fixes & cleanup [GH-34]
- Handle skipped rollbacks [GH-36]
- Internal zone detection [GH-39]
- Internal hostname conditional [GH-40]
- Empire production stacks [GH-43]

## 0.4.0 (2015-05-13)

- Optional internal DNS Zone on vpc blueprint [GH-29]
- Add environment concept [GH-27]
- Optional internal zone cname for rds databases [GH-30]

## 0.3.0 (2015-05-05)

- Remove auto-subnet splitting in vpc stack [GH-25]
- Create bucket in correct region [GH-17, GH-23]
- asg optionally sets up an ELB w/ (optional) SSL
- Remove DNS core requirement, add plugin/hook system [GH-26]

## 0.2.2 (2015-03-31)

- Allow AWS to generate the DBInstanceIdentifier

## 0.2.1 (2015-03-31)

- Bah, typo in version string, fixing

## 0.2.0 (2015-03-31)

- New taxonomy [GH-18]
- Better setup.py [GH-16] - thanks mhahn
- Use existing parameters [GH-20]
- Able to work on a subset of stacks [GH-14]
- Config cleanup [GH-9]


================================================
FILE: CODE_OF_CONDUCT.md
================================================
# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at cloudtools-maintainers@groups.google.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version]

[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/


================================================
FILE: CONTRIBUTING.md
================================================
# Contributing

Contributions are welcome, and they are greatly appreciated!

You can contribute in many ways:

## Types of Contributions

### Report Bugs

Report bugs at https://github.com/cloudtools/stacker/issues.

If you are reporting a bug, please include:

* Your operating system name and version.
* Any details about your local setup that might be helpful in troubleshooting.
* Detailed steps to reproduce the bug.

### Fix Bugs

Look through the GitHub issues for bugs. Anything tagged with "bug"
is open to whoever wants to implement it.

### Implement Features

Look through the GitHub issues for features. Anything tagged with "feature"
is open to whoever wants to implement it.

### Write Documentation

stacker could always use more documentation, whether as part of the
official stacker docs, in docstrings, or even on the web in blog posts,
articles, and such.

Note: We use [Google style docstrings](http://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html)
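
As a quick illustration (the function below is hypothetical, not part of stacker), a Google style docstring looks like this:

```python
def scale_count(base, factor=2):
    """Scale a count by a factor.

    Args:
        base (int): The starting count.
        factor (int, optional): Multiplier applied to ``base``.
            Defaults to 2.

    Returns:
        int: The scaled count.

    Raises:
        ValueError: If ``base`` is negative.
    """
    if base < 0:
        raise ValueError("base must be non-negative")
    return base * factor
```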

### Submit Feedback

The best way to send feedback is to file an issue at https://github.com/cloudtools/stacker/issues.

If you are proposing a feature:

* Explain in detail how it would work.
* Keep the scope as narrow as possible, to make it easier to implement.
* Remember that this is a volunteer-driven project, and that contributions
  are welcome :)


## Get Started!

Ready to contribute? Here's how to set up `stacker` for local development.

1. Fork the `stacker` repo on GitHub.
2. Clone your fork locally:

    ```console
    $ git clone git@github.com:your_name_here/stacker.git
    ```

3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development:

    ```console
    $ mkvirtualenv stacker
    $ cd stacker/
    $ python setup.py develop
    ```

4. Create a branch for local development:

    ```console
    $ git checkout -b name-of-your-bugfix-or-feature
    ```

   Now you can make your changes locally.

5. When you're done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox:

    ```console
    $ make test
    ```

   To get flake8, just `pip install flake8` into your virtualenv.

6. Commit your changes and push your branch to GitHub:

    ```console
    $ git add .
    $ git commit -m "Your detailed description of your changes."
    $ git push origin name-of-your-bugfix-or-feature
    ```

7. Submit a pull request through the GitHub website.

For information about the functional testing suite, see [tests/README.md](./tests/README.md).

## Pull Request Guidelines

Before you submit a pull request, check that it meets these guidelines:

1. The pull request should include tests.
2. If the pull request adds functionality, the docs should be updated. (See `Write Documentation` above for guidelines)
3. The pull request should work for Python 2.7 and for PyPy. Check
   https://circleci.com/gh/cloudtools/stacker and make sure that the tests pass for all supported Python versions.
4. Please update the `Upcoming/Master` section of the [CHANGELOG](./CHANGELOG.md) with a small bullet point about the change.


================================================
FILE: Dockerfile
================================================
FROM python:2.7.10
LABEL maintainer="Mike Barrett"

COPY scripts/docker-stacker /bin/docker-stacker
RUN mkdir -p /stacks && pip install --upgrade pip setuptools
WORKDIR /stacks
COPY . /tmp/stacker
RUN cd /tmp/stacker && python setup.py install && rm -rf /tmp/stacker

ENTRYPOINT ["docker-stacker"]
CMD ["-h"]


================================================
FILE: LICENSE
================================================
Copyright (c) 2015, Remind101, Inc.
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this
   list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice,
   this list of conditions and the following disclaimer in the documentation
   and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


================================================
FILE: Makefile
================================================
.PHONY: build lint test-unit test-functional test

build:
	docker build -t remind101/stacker .

lint:
	flake8 --ignore E402,W503,W504,W605,N818 --exclude stacker/tests/ stacker
	flake8 --ignore E402,N802,W605,N818 stacker/tests # ignore setUp naming

test-unit: clean
	python setup.py test

test-unit3: clean
	python3 setup.py test

clean:
	rm -rf .egg stacker.egg-info

test-functional:
	cd tests && bats test_suite

# General testing target for most development.
test: lint test-unit test-unit3

apidocs:
	sphinx-apidoc --force -o docs/api stacker


================================================
FILE: README.rst
================================================
=======
stacker
=======

.. image:: https://readthedocs.org/projects/stacker/badge/?version=latest
   :target: http://stacker.readthedocs.org/en/latest/

.. image:: https://circleci.com/gh/cloudtools/stacker.svg?style=shield
   :target: https://circleci.com/gh/cloudtools/stacker

.. image:: https://empire-slack.herokuapp.com/badge.svg
   :target: https://empire-slack.herokuapp.com

.. image:: https://badge.fury.io/py/stacker.svg
   :target: https://badge.fury.io/py/stacker

.. image:: https://landscape.io/github/cloudtools/stacker/master/landscape.svg?style=flat
   :target: https://landscape.io/github/cloudtools/stacker/master
   :alt: Code Health

.. image:: https://codecov.io/gh/cloudtools/stacker/branch/master/graph/badge.svg
   :target: https://codecov.io/gh/cloudtools/stacker
   :alt: codecov


For full documentation, please see the readthedocs_ site.

`Click here to join the Slack team`_ for stacker, and then join the #stacker
channel!

About
=====

stacker is a tool and library used to create & update multiple CloudFormation
stacks. It was originally written at Remind_ and
released to the open source community.

stacker Blueprints are written in troposphere_, though the goal of most
templates is to stay as generic as possible, using configuration to
customize them.

At Remind we use stacker to manage all of our CloudFormation stacks -
in development, staging, and production - without any major issues.

Requirements
============

* Python 3.7+

Stacker Command
===============

The ``stacker`` command has sub-commands, similar to git.

Here are some examples:

  ``build``:
    handles taking your stack config and then launching or updating stacks as necessary.

  ``destroy``:
    tears down your stacks

  ``diff``:
    compares your currently deployed stack templates to your config files

  ``info``:
    prints information about your currently deployed stacks

These sub-commands, and others, are fully documented on the readthedocs_ site.
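
As a sketch (the namespace, blueprint path, and variable below are
hypothetical), a minimal ``stacker.yaml`` config for ``build`` might look
like::

  namespace: example
  stacks:
    - name: vpc
      class_path: blueprints.vpc.VPC
      variables:
        InstanceType: t2.micro

Running ``stacker build conf/example.env stacker.yaml`` would then create or
update the ``vpc`` stack.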


Getting Started
===============

``stacker_cookiecutter``: https://github.com/cloudtools/stacker_cookiecutter

  We recommend creating your base `stacker` project using ``stacker_cookiecutter``.
  This tool will install all the needed dependencies and create the project
  directory structure and files. The resulting files are well documented
  with comments to explain their purpose and examples of how to extend them.

``stacker_blueprints``: https://github.com/cloudtools/stacker_blueprints

  This repository holds working examples of ``stacker`` blueprints.
  Each blueprint works in isolation and may be referenced, extended, or
  copied into your project files. The blueprints are written in Python
  and use the troposphere_ library.

``stacker reference documentation``:

  We document all functionality and features of stacker in our extensive
  reference documentation located at readthedocs_.

``AWS OSS Blog``: https://aws.amazon.com/blogs/opensource/using-aws-codepipeline-and-open-source-tools-for-at-scale-infrastructure-deployment/

  The AWS OSS Blog has a getting started guide using stacker with AWS CodePipeline.


Docker
======

stacker can also be executed from Docker. Use this method to run stacker if you
want to avoid setting up a Python environment::

  docker run -it -v `pwd`:/stacks remind101/stacker build ...

.. _Remind: http://www.remind.com/
.. _troposphere: https://github.com/cloudtools/troposphere
.. _string.Template: https://docs.python.org/2/library/string.html#template-strings
.. _readthedocs: http://stacker.readthedocs.io/en/latest/
.. _`Click here to join the Slack team`: https://empire-slack.herokuapp.com


================================================
FILE: RELEASE.md
================================================
# Steps to release a new version

## Preparing for the release

- Check out a branch named for the version: `git checkout -b release-1.1.1`
- Change the version in `setup.py` and `stacker/__init__.py`
- Update CHANGELOG.md with changes made since the last release (see below for a
  helpful command)
- Add changed files: `git add setup.py stacker/__init__.py CHANGELOG.md`
- Commit changes: `git commit -m "Release 1.1.1"`
- Create a signed tag: `git tag --sign -m "Release 1.1.1" 1.1.1`
- Push branch up to git: `git push -u origin release-1.1.1`
- Open a PR for the release, ensure that tests pass

## Releasing

- Push tag: `git push --tags`
- Merge PR into master, checkout master locally: `git checkout master; git pull`
- Create PyPI release: `python setup.py sdist upload --sign`
- Update the GitHub release page: https://github.com/cloudtools/stacker/releases
  - use the contents of the latest CHANGELOG entry for the body.

# Helper to create CHANGELOG entries

    git log --reverse --pretty=format:"%s" | tail -100 | sed 's/^/- /'


================================================
FILE: codecov.yml
================================================
comment: false


================================================
FILE: conf/README.rst
================================================
Please check out the stacker_blueprints_ repo for example configs and 
blueprints.

.. _stacker_blueprints: https://github.com/cloudtools/stacker_blueprints


================================================
FILE: docs/.gitignore
================================================
_build


================================================
FILE: docs/Makefile
================================================
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = python -m sphinx
PAPER         =
BUILDDIR      = _build

# Internal variables.
PAPEROPT_a4     = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .

.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest coverage gettext

help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html       to make standalone HTML files"
	@echo "  serve      to run a webserver in the html dir (0.0.0.0:8000)"
	@echo "  dirhtml    to make HTML files named index.html in directories"
	@echo "  singlehtml to make a single large HTML file"
	@echo "  pickle     to make pickle files"
	@echo "  json       to make JSON files"
	@echo "  htmlhelp   to make HTML files and a HTML help project"
	@echo "  qthelp     to make HTML files and a qthelp project"
	@echo "  applehelp  to make an Apple Help Book"
	@echo "  devhelp    to make HTML files and a Devhelp project"
	@echo "  epub       to make an epub"
	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
	@echo "  latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
	@echo "  text       to make text files"
	@echo "  man        to make manual pages"
	@echo "  texinfo    to make Texinfo files"
	@echo "  info       to make Texinfo files and run them through makeinfo"
	@echo "  gettext    to make PO message catalogs"
	@echo "  changes    to make an overview of all changed/added/deprecated items"
	@echo "  xml        to make Docutils-native XML files"
	@echo "  pseudoxml  to make pseudoxml-XML files for display purposes"
	@echo "  linkcheck  to check all external links for integrity"
	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"
	@echo "  coverage   to run coverage check of the documentation (if enabled)"

clean:
	rm -rf $(BUILDDIR)/*

html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."


serve:
	cd $(BUILDDIR)/html/ && python -m SimpleHTTPServer

dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

json:
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@echo
	@echo "Build finished; now you can process the JSON files."

htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in $(BUILDDIR)/htmlhelp."

qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/stacker.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/stacker.qhc"

applehelp:
	$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
	@echo
	@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
	@echo "N.B. You won't be able to view it unless you put it in" \
	      "~/Library/Documentation/Help or install it in your application" \
	      "bundle."

devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/stacker"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/stacker"
	@echo "# devhelp"

epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

latexpdfja:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through platex and dvipdfmx..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	      "(use \`make info' here to do that automatically)."

info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	make -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the " \
	      "results in $(BUILDDIR)/doctest/output.txt."

coverage:
	$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
	@echo "Testing of coverage in the sources finished, look at the " \
	      "results in $(BUILDDIR)/coverage/python.txt."

xml:
	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
	@echo
	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."

pseudoxml:
	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
	@echo
	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."


================================================
FILE: docs/api/modules.rst
================================================
stacker
=======

.. toctree::
   :maxdepth: 4

   stacker


================================================
FILE: docs/api/stacker.actions.rst
================================================
stacker\.actions package
========================

Submodules
----------

stacker\.actions\.base module
-----------------------------

.. automodule:: stacker.actions.base
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.actions\.build module
------------------------------

.. automodule:: stacker.actions.build
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.actions\.destroy module
--------------------------------

.. automodule:: stacker.actions.destroy
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.actions\.diff module
-----------------------------

.. automodule:: stacker.actions.diff
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.actions\.info module
-----------------------------

.. automodule:: stacker.actions.info
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.actions
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.blueprints.rst
================================================
stacker\.blueprints package
===========================

Subpackages
-----------

.. toctree::

    stacker.blueprints.variables

Submodules
----------

stacker\.blueprints\.base module
--------------------------------

.. automodule:: stacker.blueprints.base
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.blueprints\.testutil module
------------------------------------

.. automodule:: stacker.blueprints.testutil
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.blueprints
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.blueprints.variables.rst
================================================
stacker\.blueprints\.variables package
======================================

Submodules
----------

stacker\.blueprints\.variables\.types module
--------------------------------------------

.. automodule:: stacker.blueprints.variables.types
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.blueprints.variables
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.commands.rst
================================================
stacker\.commands package
=========================

Subpackages
-----------

.. toctree::

    stacker.commands.stacker

Module contents
---------------

.. automodule:: stacker.commands
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.commands.stacker.rst
================================================
stacker\.commands\.stacker package
==================================

Submodules
----------

stacker\.commands\.stacker\.base module
---------------------------------------

.. automodule:: stacker.commands.stacker.base
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.commands\.stacker\.build module
----------------------------------------

.. automodule:: stacker.commands.stacker.build
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.commands\.stacker\.destroy module
------------------------------------------

.. automodule:: stacker.commands.stacker.destroy
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.commands\.stacker\.diff module
---------------------------------------

.. automodule:: stacker.commands.stacker.diff
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.commands\.stacker\.info module
---------------------------------------

.. automodule:: stacker.commands.stacker.info
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.commands.stacker
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.config.rst
================================================
stacker\.config package
=======================

Subpackages
-----------

.. toctree::

    stacker.config.translators

Module contents
---------------

.. automodule:: stacker.config
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.config.translators.rst
================================================
stacker\.config\.translators package
====================================

Submodules
----------

stacker\.config\.translators\.kms module
----------------------------------------

.. automodule:: stacker.config.translators.kms
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.config.translators
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.hooks.rst
================================================
stacker\.hooks package
======================

Submodules
----------

stacker\.hooks\.aws\_lambda module
----------------------------------

.. automodule:: stacker.hooks.aws_lambda
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.hooks\.ecs module
--------------------------

.. automodule:: stacker.hooks.ecs
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.hooks\.iam module
--------------------------

.. automodule:: stacker.hooks.iam
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.hooks\.keypair module
------------------------------

.. automodule:: stacker.hooks.keypair
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.hooks\.route53 module
------------------------------

.. automodule:: stacker.hooks.route53
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.hooks\.utils module
----------------------------

.. automodule:: stacker.hooks.utils
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.hooks
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.logger.rst
================================================
stacker\.logger package
=======================

Submodules
----------

stacker\.logger\.formatter module
---------------------------------

.. automodule:: stacker.logger.formatter
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.logger\.handler module
-------------------------------

.. automodule:: stacker.logger.handler
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.logger
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.lookups.handlers.rst
================================================
stacker\.lookups\.handlers package
==================================

Submodules
----------

stacker\.lookups\.handlers\.ami module
--------------------------------------

.. automodule:: stacker.lookups.handlers.ami
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.default module
------------------------------------------

.. automodule:: stacker.lookups.handlers.default
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.dynamodb module
-------------------------------------------

.. automodule:: stacker.lookups.handlers.dynamodb
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.envvar module
-----------------------------------------

.. automodule:: stacker.lookups.handlers.envvar
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.file module
---------------------------------------

.. automodule:: stacker.lookups.handlers.file
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.hook\_data module
---------------------------------------------

.. automodule:: stacker.lookups.handlers.hook_data
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.kms module
--------------------------------------

.. automodule:: stacker.lookups.handlers.kms
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.output module
-----------------------------------------

.. automodule:: stacker.lookups.handlers.output
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.rxref module
----------------------------------------

.. automodule:: stacker.lookups.handlers.rxref
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.split module
----------------------------------------

.. automodule:: stacker.lookups.handlers.split
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.ssmstore module
-------------------------------------------

.. automodule:: stacker.lookups.handlers.ssmstore
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.lookups\.handlers\.xref module
---------------------------------------

.. automodule:: stacker.lookups.handlers.xref
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.lookups.handlers
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.lookups.rst
================================================
stacker\.lookups package
========================

Subpackages
-----------

.. toctree::

    stacker.lookups.handlers

Submodules
----------

stacker\.lookups\.registry module
---------------------------------

.. automodule:: stacker.lookups.registry
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.lookups
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.providers.aws.rst
================================================
stacker\.providers\.aws package
===============================

Submodules
----------

stacker\.providers\.aws\.default module
---------------------------------------

.. automodule:: stacker.providers.aws.default
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.providers.aws
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.providers.rst
================================================
stacker\.providers package
==========================

Subpackages
-----------

.. toctree::

    stacker.providers.aws

Submodules
----------

stacker\.providers\.base module
-------------------------------

.. automodule:: stacker.providers.base
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker.providers
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/api/stacker.rst
================================================
stacker package
===============

Subpackages
-----------

.. toctree::

    stacker.actions
    stacker.blueprints
    stacker.commands
    stacker.config
    stacker.hooks
    stacker.logger
    stacker.lookups
    stacker.providers
    stacker.tests

Submodules
----------

stacker\.context module
-----------------------

.. automodule:: stacker.context
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.environment module
---------------------------

.. automodule:: stacker.environment
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.exceptions module
--------------------------

.. automodule:: stacker.exceptions
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.plan module
--------------------

.. automodule:: stacker.plan
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.session\_cache module
------------------------------

.. automodule:: stacker.session_cache
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.stack module
---------------------

.. automodule:: stacker.stack
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.status module
----------------------

.. automodule:: stacker.status
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.tokenize\_userdata module
----------------------------------

.. automodule:: stacker.tokenize_userdata
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.util module
--------------------

.. automodule:: stacker.util
    :members:
    :undoc-members:
    :show-inheritance:

stacker\.variables module
-------------------------

.. automodule:: stacker.variables
    :members:
    :undoc-members:
    :show-inheritance:


Module contents
---------------

.. automodule:: stacker
    :members:
    :undoc-members:
    :show-inheritance:


================================================
FILE: docs/blueprints.rst
================================================
==========
Blueprints
==========

Blueprints are python classes that dynamically build CloudFormation templates. Where
you would specify a raw CloudFormation template in a stack using the ``template_path`` key,
you instead specify a blueprint python file using the ``class_path`` key.

Traditionally blueprints are built using troposphere_, but that is not absolutely
necessary. You are encouraged to check out the library of publicly shared
Blueprints in the stacker_blueprints_ package.

Making your own should be easy, and you can find many examples in
stacker_blueprints_. In the end, all that is required is that the Blueprint
is a subclass of *stacker.blueprints.base.Blueprint* and implements the
following methods:

.. code-block:: python

    # Initializes the blueprint
    def __init__(self, name, context, mappings=None):

    # Updates self.template to create the actual template
    def create_template(self):

    # Returns a tuple: (version, rendered_template)
    def render_template(self):

Variables
=========

A Blueprint can define a ``VARIABLES`` property that defines the variables
it accepts from the `Config Variables <config.html#variables>`_.

``VARIABLES`` should be a dictionary of ``<variable name>: <variable
definition>``. The variable definition should be a dictionary which
supports the following optional keys:

**type:**
  The type for the variable value. This can either be a native python
  type or one of the `Variable Types`_.

**default:**
  The default value that should be used for the variable if none is
  provided in the config.

**description:**
  A string that describes the purpose of the variable.

**validator:**
  An optional function that can do custom validation of the variable. A
  validator function should take a single argument, the value being validated,
  and should return the value if validation is successful. If there is an
  issue validating the value, an exception (``ValueError``, ``TypeError``, etc)
  should be raised by the function.

**no_echo:**
  Only valid for variables whose type subclasses ``CFNType``. Whether to
  mask the parameter value whenever anyone makes a call that describes the
  stack. If you set the value to true, the parameter value is masked with
asterisks (``*****``).

**allowed_values:**
  Only valid for variables whose type subclasses ``CFNType``. The set of
  values that should be allowed for the CloudFormation Parameter.

**allowed_pattern:**
  Only valid for variables whose type subclasses ``CFNType``. A regular
  expression that represents the patterns you want to allow for the
  CloudFormation Parameter.

**max_length:**
  Only valid for variables whose type subclasses ``CFNType``. The maximum
  length of the value for the CloudFormation Parameter.

**min_length:**
  Only valid for variables whose type subclasses ``CFNType``. The minimum
  length of the value for the CloudFormation Parameter.

**max_value:**
  Only valid for variables whose type subclasses ``CFNType``. The max
  value for the CloudFormation Parameter.

**min_value:**
  Only valid for variables whose type subclasses ``CFNType``. The min
  value for the CloudFormation Parameter.

**constraint_description:**
  Only valid for variables whose type subclasses ``CFNType``. A string
  that explains the constraint when the constraint is violated for the
  CloudFormation Parameter.


Variable Types
==============

Any native python type can be specified as the ``type`` for a variable.
You can also use the following custom types:

TroposphereType
---------------

The ``TroposphereType`` can be used to generate resources for use in the
blueprint directly from user-specified configuration. Which case applies depends
on what ``type`` was chosen, and how it would be normally used in the blueprint
(and CloudFormation in general).

Resource Types
^^^^^^^^^^^^^^

When ``type`` is a `Resource Type`_, the value specified by the user in the
configuration file must be a dictionary, but with two possible structures.

When ``many`` is disabled, the top-level dictionary keys correspond to
parameters of the ``type`` constructor. The key-value pairs will be used
directly, and one object will be created and stored in the variable.

When ``many`` is enabled, the top-level dictionary *keys* are resource titles,
and the corresponding *values* are themselves dictionaries, to be used as
parameters for creating each of multiple ``type`` objects. A list of those
objects will be stored in the variable.
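For example, assuming a variable whose ``type`` is built from ``s3.Bucket``,
the two shapes look like this in a config (bucket names are illustrative):

.. code-block:: yaml

    # many=False: keys are constructor parameters; one s3.Bucket is created.
    SingleBucket:
      BucketName: my-bucket

    # many=True: keys are resource titles, values are dictionaries of
    # constructor parameters; a list of s3.Bucket objects is created.
    Buckets:
      FirstBucket:
        BucketName: my-first-bucket
      SecondBucket:
        BucketName: my-second-bucket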

Property Types
^^^^^^^^^^^^^^

When ``type`` is a `Property Type`_ the value specified by the user in the
configuration file must be a dictionary or a list of dictionaries.

When ``many`` is disabled, the top-level dictionary keys correspond to
parameters of the ``type`` constructor. The key-value pairs will be used
directly, and one object will be created and stored in the variable.

When ``many`` is enabled, a list of dictionaries is expected. For each element,
one corresponding call will be made to the ``type`` constructor, and all the
objects produced will be stored (also as a list) in the variable.

Optional variables
^^^^^^^^^^^^^^^^^^

In either case, when ``optional`` is enabled, the variable may have no value
assigned, or be explicitly assigned a null value. When that happens the
variable's final value will be ``None``.

Example
^^^^^^^

Below is an annotated example:

.. code-block:: python

    from stacker.blueprints.base import Blueprint
    from stacker.blueprints.variables.types import TroposphereType
    from troposphere import s3, sns

    class Buckets(Blueprint):

        VARIABLES = {
            # Specify that Buckets will be a list of s3.Bucket types.
            # This means the config should be a dictionary of dictionaries
            # which will be converted into troposphere buckets.
            "Buckets": {
                "type": TroposphereType(s3.Bucket, many=True),
                "description": "S3 Buckets to create.",
            },
            # Specify that only a single bucket can be passed.
            "SingleBucket": {
                "type": TroposphereType(s3.Bucket),
                "description": "A single S3 bucket",
            },
            # Specify that Subscriptions will be a list of sns.Subscription types.
            # Note: sns.Subscription is the property type, not the standalone
            # sns.SubscriptionResource.
            "Subscriptions": {
                "type": TroposphereType(sns.Subscription, many=True),
                "description": "Multiple SNS subscription designations"
            },
            # Specify that only a single subscription can be passed, and that it
            # is made optional.
            "SingleOptionalSubscription": {
                "type": TroposphereType(sns.Subscription, optional=True),
                "description": "A single, optional SNS subscription designation"
            }
        }

        def create_template(self):
            t = self.template
            variables = self.get_variables()

            # The Troposphere s3 buckets have already been created when we
            # access variables["Buckets"], we just need to add them as
            # resources to the template.
            [t.add_resource(bucket) for bucket in variables["Buckets"]]

            # Add the single bucket to the template. You can use
            # `Ref(single_bucket)` to pass CloudFormation references to the
            # bucket just as you would with any other Troposphere type.
            single_bucket = variables["SingleBucket"]
            t.add_resource(single_bucket)

            subscriptions = variables["Subscriptions"]
            optional_subscription = variables["SingleOptionalSubscription"]
            # Handle it in some special way...
            if optional_subscription is not None:
                subscriptions.append(optional_subscription)

            t.add_resource(sns.Topic(
                TopicName="one-test",
                Subscriptions=subscriptions))

            t.add_resource(sns.Topic(
                TopicName="another-test",
                Subscriptions=subscriptions))



A sample config for the above:

..  code-block:: yaml

    stacks:
      - name: buckets
        class_path: path.to.above.Buckets
        variables:
          Buckets:
            # resource name (title) that will be added to CloudFormation.
            FirstBucket:
              # name of the s3 bucket
              BucketName: my-first-bucket
            SecondBucket:
              BucketName: my-second-bucket
          SingleBucket:
            # resource name (title) that will be added to CloudFormation.
            MySingleBucket:
              BucketName: my-single-bucket
          Subscriptions:
            - Endpoint: one-lambda
              Protocol: lambda
            - Endpoint: another-lambda
              Protocol: lambda
          # The following could be omitted entirely
          SingleOptionalSubscription:
            Endpoint: a-third-lambda
            Protocol: lambda


CFNType
-------

The ``CFNType`` can be used to signal that a variable should be submitted
to CloudFormation as a Parameter instead of only available to the
Blueprint when rendering. This is useful if you want to leverage AWS-specific
parameter types (e.g. ``List<AWS::EC2::Image::Id>``) or Systems Manager
Parameter Store values (e.g. ``AWS::SSM::Parameter::Value<String>``).
See ``stacker.blueprints.variables.types`` for available subclasses of the
``CFNType``.

Example
^^^^^^^

Below is an annotated example:

.. code-block:: python

    from troposphere import Output

    from stacker.blueprints.base import Blueprint
    from stacker.blueprints.variables.types import (
        CFNString,
        EC2AvailabilityZoneNameList,
    )


    class SampleBlueprint(Blueprint):

        VARIABLES = {
            "String": {
                "type": str,
                "description": "Simple string variable",
            },
            "List": {
                "type": list,
                "description": "Simple list variable",
            },
            "CloudFormationString": {
                "type": CFNString,
                "description": "A variable which will create a CloudFormation Parameter of type String",
            },
            "CloudFormationSpecificType": {
                "type": EC2AvailabilityZoneNameList,
                "description": "A variable which will create a CloudFormation Parameter of type List<AWS::EC2::AvailabilityZone::Name>"
            },
        }

        def create_template(self):
            t = self.template

            # `get_variables` returns a dictionary of
            # <variable name>: <variable value>. For subclasses of `CFNType`,
            # the values are instances of `CFNParameter`, which have a `ref`
            # helper property that returns a troposphere `Ref` to the
            # parameter name.
            variables = self.get_variables()

            t.add_output(Output("StringOutput", variables["String"]))

            # variables["List"] is a native list
            for index, value in enumerate(variables["List"]):
                t.add_output(Output("ListOutput:{}".format(index), value))


            # `CFNParameter` values (which wrap variables with a `type`
            # that is a `CFNType` subclass) can be converted to troposphere
            # `Ref` objects with the `ref` property.
            t.add_output(Output("CloudFormationStringOutput",
                                variables["CloudFormationString"].ref))
            t.add_output(Output("CloudFormationSpecificTypeOutput",
                                variables["CloudFormationSpecificType"].ref))


Utilizing Stack name within your Blueprint
==========================================

Sometimes you may want to use the existing stack name within your blueprint.
Stacker provides access to both the fully qualified stack name, matching what's
shown in the CloudFormation console, and the stack's short name as set in your
YAML config.

Referencing Fully Qualified Stack name
--------------------------------------

The fully qualified name is a combination of the stacker namespace and the
short name (what you set as `name` in your YAML config file). If your stacker
namespace is `StackerIsCool` and the stack's short name is
`myAwesomeEC2Instance`, the fully qualified name would be:

``StackerIsCool-myAwesomeEC2Instance``

To use this in your blueprint, you can build the name from the context:
``self.context.get_fqn(self.name)``

Referencing the Stack short name
--------------------------------

The Stack short name is the name you specified for the stack within your YAML
config. It does not include the namespace. If your stacker namespace is
`StackerIsCool` and the stack's short name is `myAwesomeEC2Instance`, the
short name would be:

``myAwesomeEC2Instance``

To use this in your blueprint, simply reference ``self.name``.

Example
^^^^^^^

Below is an annotated example creating a security group:

.. code-block:: python

  # we are importing Ref to allow for CFN References in the EC2 resource.  Tags
  # will be used to set the Name tag
  from troposphere import Ref, ec2, Tags
  from stacker.blueprints.base import Blueprint
  # CFNString is imported to allow for standalone stack use
  from stacker.blueprints.variables.types import CFNString

  class SampleBlueprint(Blueprint):

    # VpcId set here to allow for the blueprint to be reused
    VARIABLES = {
        "VpcId": {
            "type": CFNString,
            "description": "The VPC to create the Security group in",
        },
    }


    def create_template(self):
        template = self.template
        # Assigning the variables to a variable
        variables = self.get_variables()
        # now adding a SecurityGroup resource named `SecurityGroup` to the CFN template
        template.add_resource(
          ec2.SecurityGroup(
            "SecurityGroup",
            # Referring to the VpcId variable set above
            VpcId=variables['VpcId'].ref,
            # Setting the group description as the fully qualified name
            GroupDescription=self.context.get_fqn(self.name),
            # setting the Name tag to be the stack short name
            Tags=Tags(
              Name=self.name
              )
            )
          )


Testing Blueprints
==================

When writing your own blueprints, it's useful to write tests for them to make
sure they behave the way you expect, especially if there is any complex logic
inside.

To this end, a sub-class of the `unittest.TestCase` class has been
provided: `stacker.blueprints.testutil.BlueprintTestCase`. You use it
like the regular TestCase class, but it comes with an additional assertion:
`assertRenderedBlueprint`. This assertion takes a Blueprint object and renders
it, then compares it to an expected output, usually in
`tests/fixtures/blueprints`.
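A minimal test using it might look like the following sketch. The ``Buckets``
blueprint and the ``my_blueprints.buckets`` module path are hypothetical; the
``Context``/``Variable`` setup follows the pattern used by the
stacker_blueprints tests:

.. code-block:: python

    import unittest

    from stacker.blueprints.testutil import BlueprintTestCase
    from stacker.config import Config
    from stacker.context import Context
    from stacker.variables import Variable

    from my_blueprints.buckets import Buckets  # hypothetical blueprint module


    class TestBuckets(BlueprintTestCase):
        def test_buckets(self):
            ctx = Context(config=Config({'namespace': 'test'}))
            blueprint = Buckets('test_buckets', ctx)
            blueprint.resolve_variables([
                Variable('Buckets', {'MyBucket': {'BucketName': 'my-bucket'}}),
            ])
            blueprint.create_template()
            # Renders the blueprint and compares it against the fixture in
            # tests/fixtures/blueprints/test_buckets.json
            self.assertRenderedBlueprint(blueprint)


    if __name__ == '__main__':
        unittest.main()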

Examples of using the `BlueprintTestCase` class can be found in the
stacker_blueprints repo. For example, see the tests used to test the
`Route53 DNSRecords Blueprint`_ and the accompanying `output results`_.

Yaml (stacker) format tests
---------------------------

In order to wrap the `BlueprintTestCase` tests in a format similar to stacker's
stack format, the `YamlDirTestGenerator` class is provided. When subclassed in
a directory, it will search that directory for yaml files with a certain
structure and execute a test case for each. As an example:

.. code-block:: yaml

  ---
  namespace: test
  stacks:
    - name: test_stack
      class_path: stacker_blueprints.s3.Buckets
      variables:
        var1: val1

When run from tests, this will create a template fixture file called
test_stack.json containing the output from the `stacker_blueprints.s3.Buckets`
template.

Examples of using the `YamlDirTestGenerator` class can be found in the
stacker_blueprints repo. For example, see the tests used to test the
`s3.Buckets`_ class and the accompanying `fixture`_. These are
generated from a `subclass of YamlDirTestGenerator`_.

.. _troposphere: https://github.com/cloudtools/troposphere
.. _stacker_blueprints: https://github.com/cloudtools/stacker_blueprints
.. _Route53 DNSRecords Blueprint: https://github.com/cloudtools/stacker_blueprints/blob/master/tests/test_route53.py
.. _output results: https://github.com/cloudtools/stacker_blueprints/tree/master/tests/fixtures/blueprints
.. _Resource Type: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-template-resource-type-ref.html
.. _Property Type: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-product-property-reference.html
.. _s3.Buckets: https://github.com/cloudtools/stacker_blueprints/blob/master/tests/test_s3.yaml
.. _fixture: https://github.com/cloudtools/stacker_blueprints/blob/master/tests/fixtures/blueprints/s3_static_website.json
.. _subclass of YamlDirTestGenerator: https://github.com/cloudtools/stacker_blueprints/blob/master/tests/__init__.py


================================================
FILE: docs/commands.rst
================================================
========
Commands
========

Build
-----

Build is used to create/update the stacks provided in the config file. It
automatically figures out any dependencies between stacks, and creates them
in parallel safely (if a stack depends on another stack, it will wait for
that stack to be finished before updating/creating).

It also provides the *--dump* flag for testing out blueprints before
pushing them up into CloudFormation.
Even then, some errors might only be noticed after first submitting a stack,
at which point it can no longer be updated by Stacker.
When that situation is detected in interactive mode, you will be prompted to
delete and re-create the stack, so that you don't need to do it manually in the
AWS console.
If that behavior is also desired in non-interactive mode, enable the
*--recreate-failed* flag.

::

  # stacker build -h
  usage: stacker build [-h] [-e ENV=VALUE] [-r REGION] [-v] [-i]
                       [--replacements-only] [--recreate-failed] [-o]
                       [--force STACKNAME] [--stacks STACKNAME] [-t] [-d DUMP]
                       [environment] config

  Launches or updates CloudFormation stacks based on the given config. Stacker
  is smart enough to figure out if anything (the template or parameters) have
  changed for a given stack. If nothing has changed, stacker will correctly skip
  executing anything against the stack.

  positional arguments:
    environment           Path to a simple `key: value` pair environment file.
                          The values in the environment file can be used in the
                          stack config as if it were a string.Template type:
                          https://docs.python.org/2/library/string.html
                          #template-strings.
    config                The config file where stack configuration is located.
                          Must be in yaml format. If `-` is provided, then the
                          config will be read from stdin.

  optional arguments:
    -h, --help            show this help message and exit
    -e ENV=VALUE, --env ENV=VALUE
                          Adds environment key/value pairs from the command
                          line. Overrides your environment file settings. Can be
                          specified more than once.
    -r REGION, --region REGION
                          The AWS region to launch in.
    -v, --verbose         Increase output verbosity. May be specified up to
                          twice.
    -i, --interactive     Enable interactive mode. If specified, this will use
                          the AWS interactive provider, which leverages
                          Cloudformation Change Sets to display changes before
                          running cloudformation templates. You'll be asked if
                          you want to execute each change set. If you only want
                          to authorize replacements, run with "--replacements-
                          only" as well.
    --replacements-only   If interactive mode is enabled, stacker will only
                          prompt to authorize replacements.
    --recreate-failed     Destroy and re-create stacks that are stuck in a
                          failed state from an initial deployment when updating.
    -o, --outline         Print an outline of what steps will be taken to build
                          the stacks
    --force STACKNAME     If a stackname is provided to --force, it will be
                          updated, even if it is locked in the config.
    --stacks STACKNAME    Only work on the stacks given. Can be specified more
                          than once. If not specified then stacker will work on
                          all stacks in the config file.
    -t, --tail            Tail the CloudFormation logs while working with stacks
    -d DUMP, --dump DUMP  Dump the rendered Cloudformation templates to a
                          directory

Destroy
-------

Destroy handles the tearing down of CloudFormation stacks defined in the
config file. It figures out any dependencies that may exist, and destroys
the stacks in the correct order (in parallel if all dependent stacks have
already been destroyed).

::

  # stacker destroy -h
  usage: stacker destroy [-h] [-e ENV=VALUE] [-r REGION] [-v] [-i]
                         [--replacements-only] [-f] [--stacks STACKNAME] [-t]
                         environment config

  Destroys CloudFormation stacks based on the given config. Stacker will
  determine the order in which stacks should be destroyed based on any manual
  requirements they specify or output values they rely on from other stacks.

  positional arguments:
    environment           Path to a simple `key: value` pair environment file.
                          The values in the environment file can be used in the
                          stack config as if it were a string.Template type:
                          https://docs.python.org/2/library/string.html
                          #template-strings. Must define at least a "namespace".
    config                The config file where stack configuration is located.
                          Must be in yaml format. If `-` is provided, then the
                          config will be read from stdin.

  optional arguments:
    -h, --help            show this help message and exit
    -e ENV=VALUE, --env ENV=VALUE
                          Adds environment key/value pairs from the command
                          line. Overrides your environment file settings. Can be
                          specified more than once.
    -r REGION, --region REGION
                          The AWS region to launch in.
    -v, --verbose         Increase output verbosity. May be specified up to
                          twice.
    -i, --interactive     Enable interactive mode. If specified, this will use
                          the AWS interactive provider, which leverages
                          Cloudformation Change Sets to display changes before
                          running cloudformation templates. You'll be asked if
                          you want to execute each change set. If you only want
                          to authorize replacements, run with "--replacements-
                          only" as well.
    --replacements-only   If interactive mode is enabled, stacker will only
                          prompt to authorize replacements.
    -f, --force           Whether or not you want to go through with destroying
                          the stacks
    --stacks STACKNAME    Only work on the stacks given. Can be specified more
                          than once. If not specified then stacker will work on
                          all stacks in the config file.
    -t, --tail            Tail the CloudFormation logs while working with stacks

Info
----


Info displays information on the CloudFormation stacks based on the given
config.

::

  # stacker info -h
  usage: stacker info [-h] [-e ENV=VALUE] [-r REGION] [-v] [-i]
                      [--replacements-only] [--stacks STACKNAME]
                      environment config

  Gets information on the CloudFormation stacks based on the given config.

  positional arguments:
    environment           Path to a simple `key: value` pair environment file.
                          The values in the environment file can be used in the
                          stack config as if it were a string.Template type:
                          https://docs.python.org/2/library/string.html
                          #template-strings. Must define at least a "namespace".
    config                The config file where stack configuration is located.
                          Must be in yaml format. If `-` is provided, then the
                          config will be read from stdin.

  optional arguments:
    -h, --help            show this help message and exit
    -e ENV=VALUE, --env ENV=VALUE
                          Adds environment key/value pairs from the command
                          line. Overrides your environment file settings. Can be
                          specified more than once.
    -r REGION, --region REGION
                          The AWS region to launch in.
    -v, --verbose         Increase output verbosity. May be specified up to
                          twice.
    -i, --interactive     Enable interactive mode. If specified, this will use
                          the AWS interactive provider, which leverages
                          Cloudformation Change Sets to display changes before
                          running cloudformation templates. You'll be asked if
                          you want to execute each change set. If you only want
                          to authorize replacements, run with "--replacements-
                          only" as well.
    --replacements-only   If interactive mode is enabled, stacker will only
                          prompt to authorize replacements.
    --stacks STACKNAME    Only work on the stacks given. Can be specified more
                          than once. If not specified then stacker will work on
                          all stacks in the config file.

Diff
----

Diff creates a CloudFormation Change Set for each stack and displays the
resulting changes. This works for stacks that already exist and new stacks.

For stacks that are dependent on outputs from other stacks in the same file,
stacker will infer that an update was made to the "parent" stack and invalidate
outputs from resources that were changed and replace their value with
``<inferred-change: stackName.outputName=unresolvedValue>``. This is done to
illustrate the potential blast radius of a change and assist in tracking down
why subsequent stacks could change. This inference is not perfect, but takes a
"best effort" approach to showing potential changes between stacks that rely on
each other's outputs.

::

  # stacker diff -h
  usage: stacker diff [-h] [-e ENV=VALUE] [-r REGION] [-v] [-i]
                      [--replacements-only] [--force STACKNAME]
                      [--stacks STACKNAME]
                      environment config

  Diffs the config against the currently running CloudFormation stacks Sometimes
  small changes can have big impacts. Run "stacker diff" before "stacker build"
  to detect bad things(tm) from happening in advance!

  positional arguments:
    environment           Path to a simple `key: value` pair environment file.
                          The values in the environment file can be used in the
                          stack config as if it were a string.Template type:
                          https://docs.python.org/2/library/string.html
                          #template-strings. Must define at least a "namespace".
    config                The config file where stack configuration is located.
                          Must be in yaml format. If `-` is provided, then the
                          config will be read from stdin.

  optional arguments:
    -h, --help            show this help message and exit
    -e ENV=VALUE, --env ENV=VALUE
                          Adds environment key/value pairs from the command
                          line. Overrides your environment file settings. Can be
                          specified more than once.
    -r REGION, --region REGION
                          The AWS region to launch in.
    -v, --verbose         Increase output verbosity. May be specified up to
                          twice.
    -i, --interactive     Enable interactive mode. If specified, this will use
                          the AWS interactive provider, which leverages
                          Cloudformation Change Sets to display changes before
                          running cloudformation templates. You'll be asked if
                          you want to execute each change set. If you only want
                          to authorize replacements, run with "--replacements-
                          only" as well.
    --replacements-only   If interactive mode is enabled, stacker will only
                          prompt to authorize replacements.
    --force STACKNAME     If a stackname is provided to --force, it will be
                          diffed, even if it is locked in the config.
    --stacks STACKNAME    Only work on the stacks given. Can be specified more
                          than once. If not specified then stacker will work on
                          all stacks in the config file.


================================================
FILE: docs/conf.py
================================================
# -*- coding: utf-8 -*-
#
# stacker documentation build configuration file, created by
# sphinx-quickstart on Fri Aug 14 09:59:29 2015.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.

import sys
import os
import shlex

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
sys.path.insert(0, os.path.abspath('..'))

import stacker

# -- General configuration ------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    'sphinx.ext.autodoc',
    'sphinx.ext.doctest',
    'sphinx.ext.todo',
    'sphinx.ext.coverage',
    'sphinx.ext.viewcode',
    'sphinx.ext.napoleon',
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'

# The encoding of source files.
#source_encoding = 'utf-8-sig'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = u'stacker'
copyright = u'2015, Michael Barrett'
author = u'Michael Barrett'

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = stacker.__version__
# The full version, including alpha/beta/rc tags.
release = stacker.__version__

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None

# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ['_build']

# The reST default role (used for this markup: `text`) to use for all
# documents.
#default_role = None

# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True

# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []

# If true, keep warnings as "system message" paragraphs in the built documents.
#keep_warnings = False

# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = True


# -- Options for HTML output ----------------------------------------------

# The theme to use for HTML and HTML Help pages.  See the documentation for
# a list of builtin themes.
#html_theme = 'sphinx_rtd_theme'
html_theme = 'alabaster'

# Theme options are theme-specific and customize the look and feel of a theme
# further.  For a list of options available for each theme, see the
# documentation.
html_theme_options = {
    "description": "A Cloudformation Stack Manager",
    "github_button": True,
    "github_user": "cloudtools",
    "github_repo": "stacker",
    "github_banner": True,
}

# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []

# The name for this set of Sphinx documents.  If None, it defaults to
# "<project> v<release> documentation".
#html_title = None

# A shorter title for the navigation bar.  Default is the same as html_title.
#html_short_title = None

# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None

# The name of an image file (within the static path) to use as favicon of the
# docs.  This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']

# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
#html_extra_path = []

# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'

# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True

# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}

# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}

# If false, no module index is generated.
#html_domain_indices = True

# If false, no index is generated.
#html_use_index = True

# If true, the index is split into individual pages for each letter.
#html_split_index = False

# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True

# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True

# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True

# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it.  The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''

# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None

# Language to be used for generating the HTML full-text search index.
# Sphinx supports the following languages:
#   'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja'
#   'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr'
#html_search_language = 'en'

# A dictionary with options for the search language support, empty by default.
# Now only 'ja' uses this config value
#html_search_options = {'type': 'default'}

# The name of a javascript file (relative to the configuration directory) that
# implements a search results scorer. If empty, the default will be used.
#html_search_scorer = 'scorer.js'

# Output file base name for HTML help builder.
htmlhelp_basename = 'stackerdoc'

# -- Options for LaTeX output ---------------------------------------------

latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',

# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',

# Additional stuff for the LaTeX preamble.
#'preamble': '',

# Latex figure (float) alignment
#'figure_align': 'htbp',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
#  author, documentclass [howto, manual, or own class]).
latex_documents = [
  (master_doc, 'stacker.tex', u'stacker Documentation',
   u'Michael Barrett', 'manual'),
]

# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None

# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False

# If true, show page references after internal links.
#latex_show_pagerefs = False

# If true, show URL addresses after external links.
#latex_show_urls = False

# Documents to append as an appendix to all manuals.
#latex_appendices = []

# If false, no module index is generated.
#latex_domain_indices = True


# -- Options for manual page output ---------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
    (master_doc, 'stacker', u'stacker Documentation',
     [author], 1)
]

# If true, show URL addresses after external links.
#man_show_urls = False


# -- Options for Texinfo output -------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
#  dir menu entry, description, category)
texinfo_documents = [
  (master_doc, 'stacker', u'stacker Documentation',
   author, 'stacker', 'One line description of project.',
   'Miscellaneous'),
]

# Documents to append as an appendix to all manuals.
#texinfo_appendices = []

# If false, no module index is generated.
#texinfo_domain_indices = True

# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'

# If true, do not generate a @detailmenu in the "Top" node's menu.
#texinfo_no_detailmenu = False


================================================
FILE: docs/config.rst
================================================
=============
Configuration
=============

stacker makes use of a YAML formatted config file to define the different
CloudFormation stacks that make up a given environment.

The configuration file has a loose definition, with only a few top-level
keywords. Other than those keywords, you can define your own top-level keys
to make use of YAML features like `anchors & references`_ to avoid
duplicating config (see `YAML anchors & references`_ below for details).

Top Level Keywords
==================

Namespace
---------

You can provide a **namespace** to create all stacks within. The namespace will
be used as a prefix for the name of any stack that stacker creates, and makes
it unnecessary to specify the fully qualified name of the stack in output
lookups.

In addition, this value will be used to create an S3 bucket that stacker will
use to upload and store all CloudFormation templates.

In general, this is paired with the concept of `Environments
<environments.html>`_ to create a namespace per environment::

  namespace: ${namespace}

Namespace Delimiter
-------------------

By default, stacker will use '-' as a delimiter between your namespace and the
declared stack name to build the actual CloudFormation stack name that gets
created. Since child resources of your stacks will, by default, use a portion
of your stack name in the auto-generated resource names, the first characters
of your fully-qualified stack name potentially convey valuable information to
someone glancing at resource names. If you prefer not to use a delimiter, you
can set the **namespace_delimiter** top-level keyword in the config to an
empty string.

See the `CloudFormation API Reference`_ for allowed stack name characters.

.. _`CloudFormation API Reference`: http://docs.aws.amazon.com/AWSCloudFormation/latest/APIReference/API_CreateStack.html
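
A minimal sketch (the namespace here is illustrative): with an empty
delimiter, a stack named ``vpc`` in the ``example`` namespace produces a
CloudFormation stack named ``examplevpc`` rather than ``example-vpc``::

  namespace: example
  namespace_delimiter: ""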

S3 Bucket
---------

Stacker, by default, pushes your CloudFormation templates into an S3 bucket
and points CloudFormation at the template in that bucket when launching or
updating your stacks. By default it uses a bucket named
**stacker-${namespace}**, where the namespace is the namespace provided in the
config.

If you want to change this, provide the **stacker_bucket** top-level keyword
in the config.

The bucket will be created in the same region that the stacks will be launched
in.  If you want to change this, or if you already have an existing bucket
in a different region, you can set the **stacker_bucket_region** to
the region where you want to create the bucket.

**S3 Bucket location prior to 1.0.4:**
  There was a "bug" early on in stacker that created the S3 bucket in
  us-east-1 no matter what you specified as your --region. An issue came up
  that led us to believe this shouldn't be the expected behavior, so we fixed
  it. If you executed a stacker build prior to v1.0.4, your bucket for
  templates will already exist in us-east-1, requiring you to specify the
  **stacker_bucket_region** top-level keyword.

.. note::
  The fallback to the legacy template bucket location is deprecated. Stacker
  will first try the region defined by the **stacker_bucket_region** top-level
  keyword, or the one specified via the --region flag. If that fails, it falls
  back to the us-east-1 region. This fallback will be removed in a future
  release, at which point the following botocore exception will be thrown:

  ``TemplateURL must reference a valid S3 object to which you have access.``

  To avoid this issue, specify the **stacker_bucket_region** top-level keyword
  as described above. You can specify this keyword now to silence the
  deprecation warning.

If you want stacker to upload templates directly to CloudFormation, instead of
first uploading to S3, you can set **stacker_bucket** to an empty string.
However, note that template size is greatly limited when uploading directly.
See the `CloudFormation Limits Reference`_.

.. _`CloudFormation Limits Reference`: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/cloudformation-limits.html
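
A sketch combining both keywords (the bucket name and region here are
illustrative)::

  stacker_bucket: my-stacker-templates
  stacker_bucket_region: us-west-2

And to skip S3 entirely, uploading templates directly to CloudFormation::

  stacker_bucket: ""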

Module Paths
------------
When setting the ``class_path`` for blueprints/hooks, it is sometimes
desirable to load modules from outside the default ``sys.path`` (e.g., to
include modules inside the same repo as config files).

Adding a path (e.g. ``./``) to the **sys_path** top-level keyword will allow
modules from that path location to be used.
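
For example, assuming your blueprints live in a ``blueprints`` directory next
to the config file (the module path below is hypothetical)::

  sys_path: ./

  stacks:
    - name: vpc
      class_path: blueprints.vpc.VPC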

Service Role
------------

By default stacker doesn't specify a service role when executing changes to
CloudFormation stacks. If you would prefer that it do so, you can set
**service_role** to be the ARN of the service that stacker should use when
executing CloudFormation changes.

This is the equivalent of setting ``RoleARN`` on a call to the following
CloudFormation api calls: ``CreateStack``, ``UpdateStack``,
``CreateChangeSet``.

See the AWS documentation for `AWS CloudFormation Service Roles`_.

.. _`AWS CloudFormation Service Roles`: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/using-iam-servicerole.html?icmpid=docs_cfn_console
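
A sketch, using a placeholder account id and role name::

  service_role: arn:aws:iam::123456789012:role/StackerExecutionRole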

Remote Packages
---------------
The **package_sources** top level keyword can be used to define remote
sources for blueprints (e.g., retrieving ``stacker_blueprints`` on github at
tag ``v1.0.2``).

The only required key for a git repository config is ``uri``, but ``branch``,
``tag``, & ``commit`` can also be specified::

    package_sources:
      git:
        - uri: git@github.com:acmecorp/stacker_blueprints.git
        - uri: git@github.com:remind101/stacker_blueprints.git
          tag: 1.0.0
          paths:
            - stacker_blueprints
        - uri: git@github.com:contoso/webapp.git
          branch: staging
        - uri: git@github.com:contoso/foo.git
          commit: 12345678

If no specific commit or tag is specified for a repo, the remote repository
will be checked for newer commits on every execution of Stacker.

For ``.tar.gz`` & ``zip`` archives on s3, specify a ``bucket`` & ``key``::

    package_sources:
      s3:
        - bucket: mystackers3bucket
          key: archives/blueprints-v1.zip
          paths:
            - stacker_blueprints
        - bucket: anothers3bucket
          key: public/public-blueprints-v2.tar.gz
          requester_pays: true
        - bucket: yetanothers3bucket
          key: sallys-blueprints-v1.tar.gz
          # use_latest defaults to true - will update local copy if the
          # last modified date on S3 changes
          use_latest: false

Local directories can also be specified::

    package_sources:
      local:
        - source: ../vpc

Use the ``paths`` option when subdirectories of the repo/archive/directory
should be added to Stacker's ``sys.path``.

Cloned repos/archives will be cached between builds; the cache location
defaults to ``~/.stacker`` but can be manually specified via the
**stacker_cache_dir** top-level keyword.

Remote Configs
~~~~~~~~~~~~~~
Configuration yamls from remote configs can also be used by specifying a list
of ``configs`` in the repo to use::

    package_sources:
      git:
        - uri: git@github.com:acmecorp/stacker_blueprints.git
          configs:
            - vpc.yaml

In this example, the configuration in ``vpc.yaml`` will be merged into the
running current configuration, with the current configuration's values taking
priority over the values in ``vpc.yaml``.

Dictionary Stack Names & Hook Paths
:::::::::::::::::::::::::::::::::::
To allow remote configs to be selectively overridden, stack names & hook
paths can optionally be defined as dictionaries, e.g.::

  pre_build:
    my_route53_hook:
      path: stacker.hooks.route53.create_domain
      required: true
      enabled: true
      args:
        domain: mydomain.com
  stacks:
    vpc-example:
      class_path: stacker_blueprints.vpc.VPC
      locked: false
      enabled: true
    bastion-example:
      class_path: stacker_blueprints.bastion.Bastion
      locked: false
      enabled: true

Pre & Post Hooks
----------------

Many actions allow for pre & post hooks. These are python methods that are
executed before and after the action is taken for the entire config. Hooks
can be enabled or disabled individually. Only the following actions allow
pre/post hooks:

* build (keywords: *pre_build*, *post_build*)
* destroy (keywords: *pre_destroy*, *post_destroy*)

There are a few reasons to use these, though the most common is if you want
better control over the naming of a resource than what CloudFormation allows.

The keyword is a list of dictionaries with the following keys:

**path:**
  the python import path to the hook
**data_key:**
  If set, and the hook returns data (a dictionary), the results will be stored
  in the context.hook_data with the data_key as its key.
**required:**
  whether to stop execution if the hook fails
**enabled:**
  whether to execute the hook every stacker run. Default: ``True``. This is a
  bool that lets you toggle a hook per environment when combined with a
  variable pulled from an environment file.
**args:**
  a dictionary of arguments to pass to the hook

An example using the *create_domain* hook for creating a route53 domain before
the build action::

  pre_build:
    - path: stacker.hooks.route53.create_domain
      required: true
      enabled: true
      args:
        domain: mydomain.com

An example of a hook using the ``create_domain_bool`` variable from the
environment file to determine whether the hook should run. Set
``create_domain_bool: true`` or ``create_domain_bool: false`` in the
environment file to control whether the hook runs in the environment stacker
is running against::

  pre_build:
    - path: stacker.hooks.route53.create_domain
      required: true
      enabled: ${create_domain_bool}
      args:
        domain: mydomain.com
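
An example using **data_key**: if the hook returns a dictionary, the result
is stored under ``context.hook_data["my_data"]`` (the hook path, key, and
arguments below are hypothetical)::

  pre_build:
    - path: my_hooks.fetch_settings
      data_key: my_data
      args:
        environment: production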

Tags
----

CloudFormation supports arbitrary key-value pair tags. All stack-level tags, including automatically created tags, are
propagated to resources that AWS CloudFormation supports. See `AWS CloudFormation Resource Tags Type`_ for more details.
If no tags are specified, the `stacker_namespace` tag is applied to your stack with the value of `namespace` as the
tag value.

If you prefer to apply a custom set of tags, specify the top-level keyword `tags` as a map. Example::

  tags:
    "hello": world
    "my_tag:with_colons_in_key": ${dynamic_tag_value_from_my_env}
    simple_tag: simple value

If you prefer to have no tags applied to your stacks (versus the default tags that stacker applies), specify an empty
map for the top-level keyword::

  tags: {}

.. _`AWS CloudFormation Resource Tags Type`: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-resource-tags.html

Mappings
--------

Mappings are dictionaries that are provided as Mappings_ to each CloudFormation
stack that stacker produces.

These can be useful for providing things like different AMIs for different
instance types in different regions::

  mappings:
    AmiMap:
      us-east-1:
        NAT: ami-ad227cc4
        ubuntu1404: ami-74e27e1c
        bastion: ami-74e27e1c
      us-west-2:
        NAT: ami-290f4119
        ubuntu1404: ami-5189a661
        bastion: ami-5189a661

These can be used in each blueprint/stack as usual.

Lookups
-------

Lookups allow you to create custom methods which take a value and are
resolved at build time. The resolved values are passed to the `Blueprints
<blueprints.html>`_ before they are rendered. For more information, see the
`Lookups <lookups.html>`_ documentation.

stacker provides some common `lookups <lookups.html>`_, but it is
sometimes useful to have your own custom lookup that doesn't get shipped
with stacker. You can register your own lookups by defining a `lookups`
key::

  lookups:
    custom: path.to.lookup.handler

The key name for the lookup will be used as the type name when registering
the lookup. The value should be the path to a valid lookup handler.

You can then use these within your config::

  conf_value: ${custom some-input-here}


Stacks
------

This is the core part of the config - this is where you define each of the
stacks that will be deployed in the environment.  The top level keyword
*stacks* is populated with a list of dictionaries, each representing a single
stack to be built.

A stack has the following keys:

**name:**
  The logical name for this stack, which can be used in conjunction with the
  ``output`` lookup. The value here must be unique within the config. If no
  ``stack_name`` is provided, the value here will be used for the name of the
  CloudFormation stack.
**class_path:**
  The python class path to the Blueprint to be used. Specify this or
  ``template_path`` for the stack.
**template_path:**
  Path to raw CloudFormation template (JSON or YAML). Specify this or
  ``class_path`` for the stack. Path can be specified relative to the current
  working directory (e.g. templates stored alongside the Config), or relative
  to a directory in the python ``sys.path`` (i.e. for loading templates
  retrieved via ``package_sources``).

**description:**
  A short description to apply to the stack. This overwrites any description
  provided in the Blueprint. See: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/template-description-structure.html
**variables:**
  A dictionary of Variables_ to pass into the Blueprint when rendering the
  CloudFormation template. Variables_ can be any valid YAML data
  structure.
**locked:**
  (optional) If set to true, the stack is locked and will not be
  updated unless the stack is passed to stacker via the *--force* flag.
  This is useful for *risky* stacks that you don't want to take the
  risk of allowing CloudFormation to update, but still want to make
  sure get launched when the environment is first created. When ``locked``,
  it's not necessary to specify a ``class_path`` or ``template_path``.
**enabled:**
  (optional) If set to false, the stack is disabled, and will not be
  built or updated. This can allow you to disable stacks in different
  environments.
**protected:**
  (optional) When running an update in non-interactive mode, if a stack has
  *protected* set to *true* and would get changed, stacker will switch to
  interactive mode for that stack, allowing you to approve/skip the change.
**requires:**
  (optional) a list of other stacks this stack requires. This is for explicit
  dependencies - you do not need to set this if you refer to another stack in
  a Parameter, so this is rarely necessary.
**required_by:**
  (optional) a list of other stacks or targets that require this stack. It's an
  inverse to ``requires``.
**tags:**
  (optional) a dictionary of CloudFormation tags to apply to this stack. This
  will be combined with the global tags, but these tags will take precedence.
**stack_name:**
  (optional) If provided, this will be used as the name of the CloudFormation
  stack. Unlike ``name``, the value doesn't need to be unique within the config,
  since you could have multiple stacks with the same name, but in different
  regions or accounts. (note: the namespace from the environment will be
  prepended to this)
**region**:
  (optional): If provided, specifies the name of the region that the
  CloudFormation stack should reside in. If not provided, the default region
  will be used (``AWS_DEFAULT_REGION``, ``~/.aws/config`` or the ``--region``
  flag). If both ``region`` and ``profile`` are specified, the value here takes
  precedence over the value in the profile.
**profile**:
  (optional): If provided, specifies the name of an AWS profile to use when
  performing AWS API calls for this stack. This can be used to provision stacks
  in multiple accounts or regions.
**stack_policy_path**:
  (optional): If provided, specifies the path to a JSON formatted stack policy
  that will be applied when the CloudFormation stack is created and updated.
  You can use stack policies to prevent CloudFormation from making updates to
  protected resources (e.g. databases). See: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/protect-stack-resources.html
**in_progress_behavior**:
  (optional): If provided, specifies the behavior for when a stack is in
  `CREATE_IN_PROGRESS` or `UPDATE_IN_PROGRESS`. By default, stacker will raise
  an exception if the stack is in an `IN_PROGRESS` state. You can set this
  option to `wait` and stacker will wait for the previous update to complete
  before attempting to update the stack.
**notification_arns**:
  (optional): If provided, a list of AWS SNS topic ARNs that will be notified
  of this stack's CloudFormation state changes.
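
A sketch of **stack_policy_path** in use, with a minimal policy that blocks
replacement of a database resource (the file name and logical resource id are
illustrative)::

  stacks:
    - name: db
      class_path: blueprints.db.Database
      stack_policy_path: ./policies/db.json

where ``./policies/db.json`` contains::

  {
    "Statement": [
      {
        "Effect": "Deny",
        "Action": "Update:Replace",
        "Principal": "*",
        "Resource": "LogicalResourceId/Database"
      },
      {
        "Effect": "Allow",
        "Action": "Update:*",
        "Principal": "*",
        "Resource": "*"
      }
    ]
  }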

Stacks Example
~~~~~~~~~~~~~~

Here's an example from stacker_blueprints_, used to create a VPC::

  stacks:
    - name: vpc-example
      class_path: stacker_blueprints.vpc.VPC
      locked: false
      enabled: true
      variables:
        InstanceType: t2.small
        SshKeyName: default
        ImageName: NAT
        AZCount: 2
        PublicSubnets:
          - 10.128.0.0/24
          - 10.128.1.0/24
          - 10.128.2.0/24
          - 10.128.3.0/24
        PrivateSubnets:
          - 10.128.8.0/22
          - 10.128.12.0/22
          - 10.128.16.0/22
          - 10.128.20.0/22
        CidrBlock: 10.128.0.0/16

Targets
-------

In stacker, **targets** can be used as a lightweight method to group a number
of stacks together, as a named "target" in the graph. Internally, this adds a
node to the underlying DAG, which can then be used alongside the `--targets`
flag. If you're familiar with the concept of "targets" in systemd, the concept
is the same.

**name:**
  The logical name for this target.
**requires:**
  (optional) a list of stacks or other targets this target requires.
**required_by:**
  (optional) a list of stacks or other targets that require this target.

Here's an example of a target that will execute all "database" stacks::

  targets:
    - name: databases

  stacks:
    - name: dbA
      class_path: blueprints.DB
      required_by:
        - databases
    - name: dbB
      class_path: blueprints.DB
      required_by:
        - databases

Custom Log Formats
------------------

By default, stacker uses the following `log_formats`::

  log_formats:
    info: "[%(asctime)s] %(message)s"
    color: "[%(asctime)s] \033[%(color)sm%(message)s\033[39m"
    debug: "[%(asctime)s] %(levelname)s %(threadName)s %(name)s:%(lineno)d(%(funcName)s): %(message)s"

You may optionally provide custom `log_formats`. In this example, we add the environment name to each log line::

  log_formats:
    info: "[%(asctime)s] ${environment} %(message)s"
    color: "[%(asctime)s] ${environment} \033[%(color)sm%(message)s\033[39m"
    
You may use any of the standard Python
`logging module format attributes <https://docs.python.org/2.7/library/logging.html#logrecord-attributes>`_
when building your `log_formats`.


Variables
==========

Variables are values that will be passed into a `Blueprint
<blueprints.html>`_ before it is
rendered. Variables can be any valid YAML data structure and can leverage
Lookups_ to expand values at build time.

The following concepts make working with variables within large templates
easier:

YAML anchors & references
-------------------------

If you have a common set of variables that you need to pass around in many
places, it can be annoying to have to copy and paste them in multiple places.
Instead, using a feature of YAML known as `anchors & references`_, you can
define common values in a single place and then refer to them with a simple
syntax.

For example, say you pass a common domain name to each of your stacks, each of
them taking it as a Variable. Rather than having to enter the domain into
each stack (and hopefully not typo'ing any of them) you could do the
following::

  domain_name: &domain mydomain.com

Now you have an anchor called **domain** that you can use in place of any value
in the config to provide the value **mydomain.com**. You use the anchor with
a reference::

  stacks:
    - name: vpc
      class_path: stacker_blueprints.vpc.VPC
      variables:
        DomainName: *domain

Even more powerful is the ability to anchor entire dictionaries, and then
reference them in another dictionary, effectively providing it with default
values. For example::

  common_variables: &common_variables
    DomainName: mydomain.com
    InstanceType: m3.medium
    AMI: ami-12345abc

Now, rather than having to provide each of those variables to every stack that
could use them, you can just do this instead::

  stacks:
    - name: vpc
      class_path: stacker_blueprints.vpc.VPC
      variables:
        <<: *common_variables
        InstanceType: c4.xlarge # override the InstanceType in this stack
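
The merge key behaves much like Python dict unpacking: the anchored mapping
supplies defaults, and keys listed explicitly alongside the merge win. A plain
Python sketch of the semantics (illustrative only, not stacker code)::

  # Defaults shared by several stacks (mirrors the &common_variables anchor).
  common_variables = {
      "DomainName": "mydomain.com",
      "InstanceType": "m3.medium",
      "AMI": "ami-12345abc",
  }

  # "<< : *common_variables" merges the defaults in; explicit keys override.
  vpc_variables = {**common_variables, "InstanceType": "c4.xlarge"}

  print(vpc_variables["InstanceType"])  # c4.xlarge (overridden)
  print(vpc_variables["DomainName"])    # mydomain.com (inherited)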

Using Outputs as Variables
---------------------------

Since stacker encourages the breaking up of your CloudFormation stacks into
entirely separate stacks, sometimes you'll need to pass values from one stack
to another. The way this is handled in stacker is by having one stack
provide Outputs_ for all the values that another stack may need, and then
using those as the inputs for another stack's Variables_. stacker makes
this easier for you by providing a syntax for Variables_ that will cause
stacker to automatically look up the values of Outputs_ from another stack
in its config. To do so, use the following format for the Variable on the
target stack::

  MyParameter: ${output OtherStack::OutputName}

Since referencing Outputs_ from stacks is the most common use case,
`output` is the default lookup type. For more information see Lookups_.

This example is taken from stacker_blueprints_ example config - when building
things inside a VPC, you will need to pass the *VpcId* of the VPC that you
want the resources to be located in. If the *vpc* stack provides an Output
called *VpcId*, you can reference it easily::

  domain_name: &domain mydomain.com

  stacks:
    - name: vpc
      class_path: stacker_blueprints.vpc.VPC
      variables:
        DomainName: *domain
    - name: webservers
      class_path: stacker_blueprints.asg.AutoscalingGroup
      variables:
        DomainName: *domain
        VpcId: ${output vpc::VpcId} # gets the VpcId Output from the vpc stack

Note: Doing this creates an implicit dependency from the *webservers* stack
on the *vpc* stack, which will cause stacker to submit the *vpc* stack and
wait until it is complete before submitting the *webservers* stack.
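
stacker discovers these implicit dependencies by scanning variable values for
output lookups; the idea can be sketched as follows (a simplified
illustration, not stacker's actual implementation)::

  import re

  OUTPUT_RE = re.compile(r"\$\{output\s+([^:\s]+)::[^}]+\}")

  def find_requirements(variables):
      """Return the set of stack names referenced via output lookups."""
      deps = set()
      for value in variables.values():
          if isinstance(value, str):
              deps.update(OUTPUT_RE.findall(value))
      return deps

  webserver_vars = {
      "DomainName": "mydomain.com",
      "VpcId": "${output vpc::VpcId}",
  }
  print(find_requirements(webserver_vars))  # {'vpc'}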

Multi Account/Region Provisioning
---------------------------------

You can use stacker to manage CloudFormation stacks in multiple accounts and
regions, and reference outputs across them.

As an example, let's say you had 3 accounts you wanted to manage:

#) OpsAccount: An AWS account that has IAM users for employees.
#) ProdAccount: An AWS account for a "production" environment.
#) StageAccount: An AWS account for a "staging" environment.

You want employees with IAM user accounts in OpsAccount to be able to assume
roles in both the ProdAccount and StageAccount. You can use stacker to easily
manage this::


  stacks:
    # Create some stacks in both the "prod" and "stage" accounts with IAM roles
    # that employees can use.
    - name: prod/roles
      profile: prod
      class_path: blueprints.Roles
    - name: stage/roles
      profile: stage
      class_path: blueprints.Roles

    # Create a stack in the "ops" account and grant each employee access to
    # assume the roles we created above.
    - name: users
      profile: ops
      class_path: blueprints.IAMUsers
      variables:
        Users:
          john-smith:
            Roles:
              - ${output prod/roles::EmployeeRoleARN}
              - ${output stage/roles::EmployeeRoleARN}


Note how I was able to reference outputs from stacks in multiple accounts using the `output` lookup!

Environments
============

A pretty common use case is to have separate environments that you want to
look mostly the same, though with some slight modifications. For example, you
might want a *production* and a *staging* environment. The production
environment likely needs more instances, and often those instances will be
of a larger instance type. Environments allow you to use your existing
stacker config, but provide different values based on the environment file
chosen on the command line. For more information, see the
`Environments <environments.html>`_ documentation.

Translators
===========

.. note::
  Translators have been deprecated in favor of Lookups_ and will be
  removed in a future release.

Translators allow you to create custom methods which take a value, then modify
it before passing it on to the stack. Currently this is used to allow you to
pass a KMS encrypted string as a Parameter, then have KMS decrypt it before
submitting it to CloudFormation. For more information, see the
`Translators <translators.html>`_ documentation.

.. _`anchors & references`: https://en.wikipedia.org/wiki/YAML#Repeated_nodes
.. _Mappings: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/mappings-section-structure.html
.. _Outputs: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/outputs-section-structure.html
.. _stacker_blueprints: https://github.com/cloudtools/stacker_blueprints
.. _`AWS profiles`: https://docs.aws.amazon.com/cli/latest/userguide/cli-multiple-profiles.html


================================================
FILE: docs/environments.rst
================================================
============
Environments
============

When running stacker, you can optionally provide an "environment" file. The
environment file defines values, which can then be referred to by name from
your stack config file. The environment file is interpreted as YAML if it
ends in `.yaml` or `.yml`, otherwise it's interpreted as simple key/value
pairs.

Key/Value environments
----------------------

The stacker config file will be interpolated as a `string.Template
<https://docs.python.org/2/library/string.html#template-strings>`_ using the
key/value pairs from the environment file. The format of the file is a single
key/value per line, separated by a colon (**:**), like this::

  vpcID: vpc-12345678

Provided the key/value vpcID above, you will now be able to use this in
your configs for the specific environment you are deploying into. They
act as keys that can be used in your config file, providing a sort of
templating ability. This allows you to change the values of your config
based on the environment you are in. For example, if you have a *webserver*
stack, and you need to provide it a variable for the instance size it
should use, you would have something like this in your config file::

  stacks:
    - name: webservers
      class_path: stacker_blueprints.asg.AutoscalingGroup
      variables:
        InstanceType: m3.medium

But what if you needed more CPU in your production environment, but not in your
staging? Without Environments, you'd need a separate config for each. With
environments, you can simply define two different environment files with the
appropriate *InstanceType* in each, and then use the key in the environment
files in your config. For example::

  # in the file: stage.env
  web_instance_type: m3.medium

  # in the file: prod.env
  web_instance_type: c4.xlarge

  # in your config file:
  stacks:
    - name: webservers
      class_path: stacker_blueprints.asg.AutoscalingGroup
      variables:
        InstanceType: ${web_instance_type}
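
The interpolation maps directly onto Python's `string.Template`; a rough
sketch of what happens to the config text (illustrative only)::

  from string import Template

  config_text = """\
  stacks:
    - name: webservers
      class_path: stacker_blueprints.asg.AutoscalingGroup
      variables:
        InstanceType: ${web_instance_type}
  """

  # Key/value pairs as parsed from prod.env.
  environment = {"web_instance_type": "c4.xlarge"}

  # safe_substitute leaves unknown ${...} placeholders (such as lookups)
  # untouched instead of raising.
  rendered = Template(config_text).safe_substitute(environment)
  print("InstanceType: c4.xlarge" in rendered)  # True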

YAML environments
-----------------

YAML environments allow for more complex environment configuration rather
than simple text substitution, and support YAML features like anchors and
references. To build on the example above, let's define a stack that's
a little more complex::

  stacks:
    - name: webservers
      class_path: stacker_blueprints.asg.AutoscalingGroup
      variables:
        InstanceType: ${web_instance_type}
        IngressCIDRsByPort: ${ingress_cidrs_by_port}

We've defined a stack which expects a list of ingress CIDRs allowed access to
each port. Our environment files would look like this::

  # in the file: stage.yml
  web_instance_type: m3.medium
  ingress_cidrs_by_port:
    80:
      - 192.168.1.0/8
    8080:
      - 0.0.0.0/0

  # in the file: prod.yml
  web_instance_type: c4.xlarge
  ingress_cidrs_by_port:
    80:
      - 192.168.1.0/8
    443:
      - 10.0.0.0/16
      - 10.1.0.0/16

The YAML format allows for specifying lists and maps, and supports all
`pyyaml` functionality allowed by the `safe_load()` function.

Variable substitution in the YAML case is a bit more complex than in the
`string.Template` case. Objects can only be substituted for variables in the
case where we perform a full substitution, such as this::

  vpcID: ${vpc_variable}

We can not substitute an object in a sub-string, such as this::

  vpcID: prefix-${vpc_variable}

It makes no sense to substitute a complex object in this case, and stacker
will raise an error if that happens. You can still perform this substitution
with primitives (numbers, strings), but not with dicts or lists.
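
The rule can be sketched in plain Python: a value that is exactly one
``${...}`` reference may be replaced by any object, while a reference
embedded in a longer string may only be replaced by a primitive (illustrative
only, not stacker's implementation)::

  import re

  VAR_RE = re.compile(r"\$\{(\w+)\}")

  def substitute(value, env):
      full = VAR_RE.fullmatch(value)
      if full:
          # Full substitution: the object replaces the value wholesale.
          return env[full.group(1)]

      def replace(match):
          replacement = env[match.group(1)]
          if isinstance(replacement, (dict, list)):
              raise ValueError(
                  "cannot embed a complex object in a sub-string: %s" % value)
          return str(replacement)

      return VAR_RE.sub(replace, value)

  env = {"vpc_variable": {"id": "vpc-12345678"}, "prefix": "prod"}
  print(substitute("${vpc_variable}", env))  # {'id': 'vpc-12345678'}
  print(substitute("${prefix}-web", env))    # prod-web
  # substitute("prefix-${vpc_variable}", env) would raise ValueError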

.. note::
  Namespace defined in the environment file has been deprecated in favor of
  defining the namespace in the config and will be removed in a future release.


================================================
FILE: docs/index.rst
================================================
.. stacker documentation master file, created by
   sphinx-quickstart on Fri Aug 14 09:59:29 2015.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Welcome to stacker's documentation!
===================================

stacker is a tool and library used to create & update multiple CloudFormation
stacks. It was originally written at Remind_ and
released to the open source community.

stacker Blueprints are written in troposphere_, though the purpose of
most templates is to keep them as generic as possible and then use
configuration to modify them.

At Remind we use stacker to manage all of our CloudFormation stacks - in
development, staging, and production - without any major issues.


Main Features
-------------

- Easily `Create/Update <commands.html#build>`_/`Destroy <commands.html#destroy>`_
  many stacks in parallel (though with an understanding of cross-stack
  dependencies)
- Makes it easy to manage large environments in a single config, while still
  allowing you to break each part of the environment up into its own
  completely separate stack.
- Manages dependencies between stacks, only launching one after all the stacks
  it depends on are finished.
- Only updates stacks that have changed and that have not been explicitly
  locked or disabled.
- Easily pass Outputs from one stack in as Variables on another (which also
  automatically provides an implicit dependency)
- Use `Environments <environments.html>`_ to manage slightly different
  configuration in different environments.
- Use `Lookups <lookups.html>`_ to allow dynamic fetching or altering of
  data used in Variables.
- A diff command for diffing your config against what is running in a live
  CloudFormation environment.
- A small library of pre-shared Blueprints can be found at the
  stacker_blueprints_ repo, making things like setting up a VPC easy.


Contents:

.. toctree::
   :maxdepth: 2

   organizations_using_stacker
   terminology
   config
   environments
   translators
   lookups
   commands
   blueprints
   templates
   API Docs <api/modules>



Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

.. _Remind: http://www.remind.com/
.. _troposphere: https://github.com/cloudtools/troposphere
.. _stacker_blueprints: https://github.com/cloudtools/stacker_blueprints


================================================
FILE: docs/lookups.rst
================================================
=======
Lookups
=======

Stacker provides the ability to dynamically replace values in the config via a
concept called lookups. A lookup is meant to take a value and convert
it by calling out to another service or system.

A lookup is denoted in the config with the ``${<lookup type> <lookup
input>}`` syntax. If ``<lookup type>`` isn't provided, stacker will
fall back to using the ``output`` lookup.

Lookups are only resolved within `Variables
<terminology.html#variables>`_. They can be nested in any part of a YAML
data structure and within another lookup itself.

.. note::
  If a lookup has a non-string return value, it can be the only lookup
  within a value.

  ie. if `custom` returns a list, this would raise an exception::

    Variable: ${custom something}, ${output otherStack::Output}

  This is valid::

    Variable: ${custom something}


For example, given the following::

  stacks:
    - name: sg
      class_path: some.stack.blueprint.Blueprint
      variables:
        Roles:
          - ${output otherStack::IAMRole}
        Values:
          Env:
            Custom: ${custom ${output otherStack::Output}}
            DBUrl: postgres://${output dbStack::User}@${output dbStack::HostName}

The Blueprint would have access to the following resolved variables
dictionary::

  # variables
  {
    "Roles": ["other-stack-iam-role"],
    "Values": {
      "Env": {
        "Custom": "custom-output",
        "DBUrl": "postgres://user@hostname",
      },
    },
  }


stacker includes the following lookup types:

  - `output lookup`_
  - `ami lookup`_
  - `custom lookup`_
  - `default lookup`_
  - `dynamodb lookup`_
  - `envvar lookup`_
  - `file lookup`_
  - `hook_data lookup`_
  - `kms lookup`_
  - `rxref lookup`_
  - `ssmstore lookup`_
  - `xref lookup`_

.. _`output lookup`:

Output Lookup
-------------

The ``output`` lookup takes a value of the format:
``<stack name>::<output name>`` and retrieves the output from the given stack
name within the current namespace.

stacker treats output lookups differently than other lookups: it automatically
adds the stack referenced in the lookup as a requirement of the stack whose
variable the output value is being passed to.

You can specify an output lookup with the following syntax::

  ConfVariable: ${output someStack::SomeOutput}


.. _`default lookup`:

default Lookup
--------------

The ``default`` lookup type will check if a value exists for the variable
in the environment file, then fall back to a default defined in the stacker
config if the environment file doesn't contain the variable. This allows
defaults to be set at the config file level, while granting the user the
ability to override that value per environment.

Format of value::

  <env_var>::<default value>

For example::

  Groups: ${default app_security_groups::sg-12345,sg-67890}

If `app_security_groups` is defined in the environment file, its defined
value will be returned. Otherwise, `sg-12345,sg-67890` will be the returned
value.

.. note::
  The ``default`` lookup only supports checking whether a variable is defined
  in an environment file; it does not support checking whether other embedded
  lookups resolve. If you have the ``default`` lookup perform any other lookup
  that fails, stacker will throw an exception for that lookup and stop your
  build before it gets a chance to fall back to the default in your config.
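
The fallback amounts to a dictionary lookup with a default, assuming the
environment file has already been parsed into a dict (an illustrative sketch,
not stacker's implementation)::

  def default_lookup(value, environment):
      """Resolve '<env_var>::<default value>' against an environment dict."""
      env_var, _, default = value.partition("::")
      return environment.get(env_var, default)

  # The environment file does not define app_security_groups:
  print(default_lookup("app_security_groups::sg-12345,sg-67890", {}))
  # sg-12345,sg-67890

  # The environment file overrides it:
  print(default_lookup("app_security_groups::sg-12345,sg-67890",
                       {"app_security_groups": "sg-abcdef"}))
  # sg-abcdef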

.. _`kms lookup`:

KMS Lookup
----------

The ``kms`` lookup type decrypts its input value.

As an example, if you have a database and it has a parameter called
``DBPassword`` that you don't want to store in clear text in your config
(maybe because you want to check it into your version control system to
share with the team), you could instead encrypt the value using ``kms``.

For example::

  # We use the aws cli to get the encrypted value for the string
  # "PASSWORD" using the master key called 'myStackerKey' in us-east-1
  $ aws --region us-east-1 kms encrypt --key-id alias/myStackerKey \
      --plaintext "PASSWORD" --output text --query CiphertextBlob

  CiD6bC8t2Y<...encrypted blob...>

  # In stacker we would reference the encrypted value like:
  DBPassword: ${kms us-east-1@CiD6bC8t2Y<...encrypted blob...>}

  # The above would resolve to
  DBPassword: PASSWORD

This requires that the person using stacker has access to the master key used
to encrypt the value.

It is also possible to store the encrypted blob in a file (useful if the
value is large) using the ``file://`` prefix, ie::

  DockerConfig: ${kms file://dockercfg}

.. note::
  Lookups resolve the path specified with `file://` relative to
  the location of the config file, not where the stacker command is run.
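
The lookup input itself follows a ``[<region>@]<ciphertext>`` shape,
optionally indirected through ``file://``. A sketch of parsing it
(illustrative only, not stacker's code; the decrypted plaintext then comes
from a KMS ``Decrypt`` call, which is omitted here)::

  import os

  def parse_kms_input(value, config_dir="."):
      """Split '[region@]ciphertext' and resolve file:// indirection."""
      region = None
      if "@" in value:
          region, value = value.split("@", 1)
      if value.startswith("file://"):
          # The blob is read relative to the config file's directory.
          path = os.path.join(config_dir, value[len("file://"):])
          with open(path) as blob_file:
              value = blob_file.read()
      return region, value

  print(parse_kms_input("us-east-1@CiD6bC8t2Y..."))
  # ('us-east-1', 'CiD6bC8t2Y...')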

.. _`xref lookup`:

XRef Lookup
-----------

The ``xref`` lookup type is very similar to the ``output`` lookup type, the
difference being that ``xref`` resolves output values from stacks that aren't
contained within the current stacker namespace, but are existing stacks
containing outputs within the same region of the AWS account you are
deploying into. ``xref`` allows you to look up these outputs from stacks
already in your account by specifying the stack's fully qualified name (as it
appears in the CloudFormation console).

Where the ``output`` type will take a stack name and use the current context
to expand the fully qualified stack name based on the namespace, ``xref``
skips this expansion because it assumes you've provided it with
the fully qualified stack name already. This allows you to reference
output values from any CloudFormation stack in the same region.

Also, unlike the ``output`` lookup type, ``xref`` doesn't impact stack
requirements.

For example::

  ConfVariable: ${xref fully-qualified-stack::SomeOutput}

.. _`rxref lookup`:

RXRef Lookup
------------

The ``rxref`` lookup type is very similar to the ``xref`` lookup type, the
difference being that ``rxref`` looks up output values from stacks that are
relative to the current namespace but external to the current stacker config.
``rxref`` assumes the stack containing the output already exists.

Where the ``xref`` type assumes you provided a fully qualified stack name,
``rxref``, like ``output``, expands the given stack name with the current
namespace and retrieves the output from the resulting stack, even if that
stack is not defined in the stacker config you provided.

Because there is no requirement to keep all stacks defined within the same
stacker YAML config, you might need the ability to read outputs from other
stacks deployed by stacker into your same account under the same namespace.
``rxref`` gives you that ability. This is useful if you want to break up
very large configs into smaller groupings.

Also, unlike the ``output`` lookup type, ``rxref`` doesn't impact stack
requirements.

For example::

  # in stacker.env
  namespace: MyNamespace

  # in stacker.yml
  ConfVariable: ${rxref my-stack::SomeOutput}

  # the above would effectively resolve to
  ConfVariable: ${xref MyNamespace-my-stack::SomeOutput}

Although possible, it is not recommended to use ``rxref`` for stacks defined
within the same stacker YAML config.
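
The expansion ``rxref`` performs is just prefixing the namespace onto the
relative stack name (a simplified sketch; stacker joins the two with the
configured namespace delimiter, shown here as the default ``-``)::

  def expand_stack_name(namespace, stack_name, delimiter="-"):
      """Fully qualify a relative stack name, as rxref/output do."""
      return "%s%s%s" % (namespace, delimiter, stack_name)

  print(expand_stack_name("MyNamespace", "my-stack"))  # MyNamespace-my-stack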

.. _`file lookup`:

File Lookup
-----------

The ``file`` lookup type allows the loading of arbitrary data from files on
disk. The lookup additionally supports using a ``codec`` to manipulate or
wrap the file contents prior to injecting it. The parameterized-b64 ``codec``
is particularly useful to allow the interpolation of CloudFormation parameters
in a UserData attribute of an instance or launch configuration.

Basic examples::

  # We've written a file to /some/path:
  $ echo "hello there" > /some/path

  # In stacker we would reference the contents of this file with the following
  conf_key: ${file plain:file://some/path}

  # The above would resolve to
  conf_key: hello there

  # Or, if we wanted a base64 encoded copy of the file data
  conf_key: ${file base64:file://some/path}

  # The above would resolve to
  conf_key: aGVsbG8gdGhlcmUK

Supported codecs:
 - plain - load the contents of the file untouched. This is the only codec
   that should be used with raw CloudFormation templates (the other codecs
   are intended for blueprints).
 - base64 - encode the plain text file at the given path with base64 prior
   to returning it
 - parameterized - the same as plain, but additionally supports
   referencing CloudFormation parameters to create userdata that's
   supplemented with information from the template, as is commonly needed
   in EC2 UserData. For example, given a template parameter of BucketName,
   the file could contain the following text::

     #!/bin/sh
     aws s3 sync s3://{{BucketName}}/somepath /somepath

   and then you could use something like this in the YAML config file::

     UserData: ${file parameterized:/path/to/file}

   resulting in the UserData parameter being defined as::

     { "Fn::Join" : ["", [
       "#!/bin/sh\naws s3 sync s3://",
       {"Ref" : "BucketName"},
       "/somepath /somepath"
     ]] }

 - parameterized-b64 - the same as parameterized, with the results additionally
   wrapped in { "Fn::Base64": ... } , which is what you actually need for
   EC2 UserData
 - json - decode the file as JSON and return the resulting object
 - json-parameterized - Same as ``json``, but applying templating rules from
   ``parameterized`` to every object *value*. Note that object *keys* are not
   modified. Example (an external PolicyDocument)::

     {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "some:Action"
          ],
          "Resource": "{{MyResource}}"
        }
      ]
     }

 - yaml - decode the file as YAML and return the resulting object. All strings
   are returned as ``unicode`` even in Python 2.
 - yaml-parameterized - Same as ``json-parameterized``, but using YAML. Example::

     Version: 2012-10-17
      Statement:
       - Effect: Allow
         Action:
           - "some:Action"
         Resource: "{{MyResource}}"


When using parameterized-b64 for UserData, you should use a local_parameter
defined like this::

  from troposphere import AWSHelperFn, Ref

  "UserData": {
    "type": AWSHelperFn,
    "description": "Instance user data",
    "default": Ref("AWS::NoValue")
  }

and then assign UserData in a LaunchConfiguration or Instance to
``self.get_variables()["UserData"]``. Note that we use AWSHelperFn as the
type because the parameterized-b64 codec returns either a Base64 or a
GenericHelperFn troposphere object.
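
The transformation the ``parameterized`` codec performs can be sketched as
splitting the file contents on ``{{Name}}`` markers and interleaving ``Ref``
objects into an ``Fn::Join`` (an illustration of the output shape, not
stacker's code)::

  import re

  PARAM_RE = re.compile(r"{{(\w+)}}")

  def parameterize(raw):
      """Turn '{{Name}}' markers into an Fn::Join of strings and Refs."""
      parts = []
      pos = 0
      for match in PARAM_RE.finditer(raw):
          if match.start() > pos:
              parts.append(raw[pos:match.start()])
          parts.append({"Ref": match.group(1)})
          pos = match.end()
      if pos < len(raw):
          parts.append(raw[pos:])
      return {"Fn::Join": ["", parts]}

  userdata = "#!/bin/sh\naws s3 sync s3://{{BucketName}}/somepath /somepath"
  print(parameterize(userdata))
  # {'Fn::Join': ['', ['#!/bin/sh\naws s3 sync s3://',
  #                    {'Ref': 'BucketName'}, '/somepath /somepath']]}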

.. _`ssmstore lookup`:

SSM Parameter Store Lookup
--------------------------

The ``ssmstore`` lookup type retrieves a value from the Simple Systems
Manager Parameter Store.

As an example, if you have a database and it has a parameter called
``DBUser`` that you don't want to store in clear text in your config,
you could instead store it as a SSM parameter named ``MyDBUser``.

For example::

  # We use the aws cli to store the database username
  $ aws ssm put-parameter --name "MyDBUser" --type "String" \
      --value "root"

  # In stacker we would reference the value like:
  DBUser: ${ssmstore us-east-1@MyDBUser}

  # Which would resolve to:
  DBUser: root

Encrypted values ("SecureStrings") can also be used, which will be
automatically decrypted (assuming the Stacker user has access to the
associated KMS key). Care should be taken when using this with encrypted
values (i.e. a safe policy is to only use it with ``no_echo`` CFNString
values)

The region can be omitted (e.g. ``DBUser: ${ssmstore MyDBUser}``), in which
case ``us-east-1`` will be assumed.

.. _`dynamodb lookup`:

DynamoDB Lookup
---------------

The ``dynamodb`` lookup type retrieves a value from a DynamoDB table.

As an example, if you have a DynamoDB table named ``TestTable`` with an item
whose primary partition key ``TestKey`` has the value ``TestVal``, and that
item has an attribute named ``BucketName``, you can look the attribute up
with stacker.

For example::

  # We can reference that dynamo value
  BucketName: ${dynamodb us-east-1:TestTable@TestKey:TestVal.BucketName}

  # Which would resolve to:
  BucketName: stacker-test-bucket

You can lookup other data types by putting the data type in the lookup. Valid
values are "S"(String), "N"(Number), "M"(Map), "L"(List).

For example::

  ServerCount: ${dynamodb us-east-1:TestTable@TestKey:TestVal.ServerCount[N]}

This would return an int value, rather than a string.

You can also look up values inside of a map. For example::

  ServerCount: ${dynamodb us-east-1:TestTable@TestKey:TestVal.ServerInfo[M].ServerCount[N]}
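
The lookup value packs several pieces of addressing into one string; parsing
the ``<region>:<table>@<partition key>:<key value>.<attribute>`` format might
be sketched as follows (illustrative only, not stacker's parser)::

  import re

  LOOKUP_RE = re.compile(
      r"^(?P<region>[^:]+):(?P<table>[^@]+)@"
      r"(?P<pk>[^:]+):(?P<pk_value>[^.]+)\.(?P<attribute>.+)$")

  def parse_dynamodb_lookup(value):
      """Break a dynamodb lookup string into its addressing parts."""
      return LOOKUP_RE.match(value).groupdict()

  parsed = parse_dynamodb_lookup(
      "us-east-1:TestTable@TestKey:TestVal.ServerCount[N]")
  print(parsed["table"])      # TestTable
  print(parsed["attribute"])  # ServerCount[N]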


.. _`envvar lookup`:

Shell Environment Lookup
------------------------

The ``envvar`` lookup type retrieves a value from a variable in the shell's
environment.

Example::

  # Set an environment variable in the current shell.
  $ export DATABASE_USER=root

  # In the stacker config we could reference the value:
  DBUser: ${envvar DATABASE_USER}

  # Which would resolve to:
  DBUser: root

You can also get the variable name from a file, by using the ``file://`` prefix
in the lookup, like so::

  DBUser: ${envvar file://dbuser_file.txt}

.. _`ami lookup`:

EC2 AMI Lookup
--------------

The ``ami`` lookup is meant to search for the most recent AMI created that
matches the given filters.

Valid arguments::

  region OPTIONAL ONCE:
      e.g. us-east-1@

  owners (comma delimited) REQUIRED ONCE:
      aws_account_id | amazon | self

  name_regex (a regex) REQUIRED ONCE:
      e.g. my-ubuntu-server-[0-9]+

  executable_users (comma delimited) OPTIONAL ONCE:
      aws_account_id | amazon | self

Any other arguments specified are sent as filters to the aws api. For
example, "architecture:x86_64" will add a filter.

Example::

  # Grabs the most recently created AMI that is owned by either this account,
  # amazon, or the account id 888888888888 that has a name that matches
  # the regex "server[0-9]+" and has "i386" as its architecture.

  # Note: The region is optional, and defaults to the current stacker region
  ImageId: ${ami [<region>@]owners:self,888888888888,amazon name_regex:server[0-9]+ architecture:i386}

.. _`hook_data lookup`:

Hook Data Lookup
----------------

When using hooks, you can have the hook store results in the
`hook_data`_ dictionary on the context by setting *data_key* in the hook
config.

This lookup lets you look up values in that dictionary. A good example of this
is when you use the `aws_lambda hook`_ to upload AWS Lambda code, then need to
pass that code object as the *Code* variable in the `aws_lambda blueprint`_
dictionary.

Example::

  # If you set the "data_key" config on the aws_lambda hook to be "myfunction"
  # and you name the function package "TheCode" you can get the troposphere
  # awslambda.Code object with:

  Code: ${hook_data myfunction::TheCode}

.. _`custom lookup`:

Custom Lookup
--------------

A custom lookup may be registered within the config.
For more information see `Configuring Lookups <config.html#lookups>`_.
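
A handler is just a callable that receives the lookup input and returns the
resolved value. A minimal sketch (the keyword arguments shown are assumptions
about what stacker passes to handlers; see the config documentation for the
exact interface)::

  def handler(value, context=None, provider=None, **kwargs):
      """A toy custom lookup: upper-case its input."""
      return value.upper()

  # Registered in the stacker config as, e.g.:
  #
  #   lookups:
  #     shout: path.to.module.handler
  #
  # and then used in a variable as ${shout some-value}.
  print(handler("some-value"))  # SOME-VALUE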


.. _`hook_data`: http://stacker.readthedocs.io/en/latest/config.html#pre-post-hooks
.. _`aws_lambda hook`: http://stacker.readthedocs.io/en/latest/api/stacker.hooks.html#stacker.hooks.aws_lambda.upload_lambda_functions
.. _`aws_lambda blueprint`: https://github.com/cloudtools/stacker_blueprints/blob/master/stacker_blueprints/aws_lambda.py


================================================
FILE: docs/organizations_using_stacker.rst
================================================
===========================
Organizations using stacker
===========================

Below is a list of organizations that currently use stacker in some sense. If
you are using stacker, please submit a PR and add your company below!

Remind_

  Remind helps educators send quick, simple messages to students and parents on
  any device. We believe that when communication improves, relationships get
  stronger. Education gets better. 

  Remind is the original author of stacker, and has been using it to manage the
  infrastructure in multiple environments (including production) since early
  2015.


.. _Remind: https://www.remind.com/

`Onica`_

  Onica is a global technology consulting company at the forefront of 
  cloud computing. Through collaboration with Amazon Web Services, 
  we help customers embrace a broad spectrum of innovative solutions,
  from migration strategy to operational excellence, cloud native
  development, and immersive transformation. Onica is a full spectrum
  AWS integrator.

.. _`Onica`: https://www.onica.com

AltoStack_

  AltoStack is a technology and services consultancy specialising in Cloud
  Consultancy, DevOps, Continuous Delivery and Configuration Management.

  From strategy and operations to culture and technology, AltoStack helps
  businesses identify and address opportunities for growth and profitability.

  We are an Amazon Web Services - (AWS) APN Consulting Partner.

.. _AltoStack: https://altostack.io/

Cobli_

  Cobli develops cutting-edge solutions for fleet management efficiency and
  intelligence in South America. We bring advanced tracking, analysis and
  predictions to fleets of any size by connecting vehicles to an easy to use
  platform through smart devices.

  Cobli manages most of its AWS infrastructure using stacker, and we encourage
  our developers to contribute to free-software whenever possible.

.. _Cobli: https://cobli.co/


================================================
FILE: docs/templates.rst
================================================
==========
Templates
==========

CloudFormation templates can be provided via python Blueprints_ or JSON/YAML.
JSON/YAML templates are specified for stacks via the ``template_path`` config
option (see `Stacks <config.html#stacks>`_).

Jinja2 Templating
=================

Templates with a ``.j2`` extension will be parsed using `Jinja2 
<http://jinja.pocoo.org/>`_. The stacker ``context`` and ``mappings`` objects
and stack ``variables`` objects are available for use in the template:

.. code-block:: yaml

    Description: TestTemplate
    Resources:
      Bucket:
        Type: AWS::S3::Bucket
        Properties:
          BucketName: {{ context.environment.foo }}-{{ variables.myparamname }}


================================================
FILE: docs/terminology.rst
================================================
===========
Terminology
===========

blueprint
=========

.. _blueprints:

A python class that is responsible for creating a CloudFormation template.
Usually this is built using troposphere_.

config
======

A YAML config file that defines the `stack definitions`_ for all of the
stacks you want stacker to manage.

environment
===========

A set of variables that can be used inside the config, allowing you to
slightly adjust configs based on which environment you are launching.

namespace
=========

A way to uniquely identify a stack. Used to determine the naming of many
things, such as the S3 bucket where compiled templates are stored, as well
as the prefix for stack names.

stack definition
================

.. _stack definitions:

Defines the stack_ you want to build, usually there are multiple of these in
the config_. It also defines the variables_ to be used when building the
stack_.

stack
=====

.. _stacks:

The resulting stack of resources that is created by CloudFormation when it
executes a template. Each stack managed by stacker is defined by a
`stack definition`_ in the config_.

output
======

A CloudFormation Template concept. Stacks can output values, allowing easy
access to those values. Often used to export the unique ID's of resources that
templates create. Stacker makes it simple to pull outputs from one stack and
then use them as a variable_ in another stack.

variable
========

.. _variables:

Dynamic variables that are passed into stacks when they are being built.
Variables are defined within the config_.

lookup
======

A method for expanding values in the config_ at build time. By default
lookups are used to reference Output values from other stacks_ within the
same namespace_.

provider
========

Provider that supports provisioning rendered blueprints_. By default, an
AWS provider is used.

context
=======

Context is responsible for translating the values passed in via the
command line and specified in the config_ to stacks_.

.. _troposphere: https://github.com/cloudtools/troposphere
.. _CloudFormation Parameters: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/parameters-section-structure.html


================================================
FILE: docs/translators.rst
================================================
===========
Translators
===========

.. note::
  Translators have been deprecated in favor of `Lookups <lookups.html>`_
  and will be removed in a future release.

Stacker provides the ability to dynamically replace values in the config via a
concept called translators. A translator is meant to take a value and convert
it by calling out to another service or system. This was initially intended to
support encrypted fields in your config.

Translators are custom YAML constructors. As an example, if you have a
database and it has a parameter called ``DBPassword`` that you don't want to
store in clear text in your config (maybe because you want to check it into
your version control system to share with the team), you could instead
encrypt the value using ``kms``. For example::

  # We use the aws cli to get the encrypted value for the string
  # "PASSWORD" using the master key called 'myStackerKey' in us-east-1
  $ aws --region us-east-1 kms encrypt --key-id alias/myStackerKey \
      --plaintext "PASSWORD" --output text --query CiphertextBlob

  CiD6bC8t2Y<...encrypted blob...>

  # In stacker we would reference the encrypted value like:
  DBPassword: !kms us-east-1@CiD6bC8t2Y<...encrypted blob...>

  # The above would resolve to
  DBPassword: PASSWORD

This requires that the person using stacker has access to the master key used
to encrypt the value.

It is also possible to store the encrypted blob in a file (useful if the
value is large) using the `file://` prefix, e.g.::

  DockerConfig: !kms file://dockercfg

.. note::
  Translators resolve the path specified with `file://` relative to
  the location of the config file, not where the stacker command is run.


================================================
FILE: examples/cross-account/.aws/config
================================================
# The master account is like the root of our AWS account tree. It's the
# entrypoint for all other profiles to sts.AssumeRole from.
[profile master]
region = us-east-1
role_arn = arn:aws:iam::<master account id>:role/Stacker
role_session_name = stacker
credential_source = Environment

[profile prod]
region = us-east-1
role_arn = arn:aws:iam::<prod account id>:role/Stacker
role_session_name = stacker
source_profile = master

[profile stage]
region = us-east-1
role_arn = arn:aws:iam::<stage account id>:role/Stacker
role_session_name = stacker
source_profile = master


================================================
FILE: examples/cross-account/README.md
================================================
This is a secure example setup to support cross-account provisioning of stacks with stacker. It:

1. Sets up an appropriate [AWS Config File](https://docs.aws.amazon.com/cli/latest/topic/config-vars.html) in [.aws/config] for stacker to use, with profiles for "master", "prod", and "stage" AWS accounts.
2. Configures a stacker bucket in the "master" account, with permissions that allow CloudFormation in "sub" accounts to fetch templates.

## Setup

### Create IAM roles

First things first, we need to create some IAM roles that stacker can assume to make changes in each AWS account. This is generally a manual step after you've created a new AWS account.

In each account, create a new stack using the [stacker-role.yaml](./templates/stacker-role.yaml) CloudFormation template. This will create an IAM role called `Stacker` in the target account, with a trust policy that will allow the `Stacker` role in the master account to `sts:AssumeRole` it.
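
For example, with the AWS CLI (the stack name is up to you; `CAPABILITY_NAMED_IAM` is required because the template creates a role with an explicit `RoleName`):

```console
$ aws cloudformation create-stack \
  --profile <profile> \
  --stack-name stacker-role \
  --template-body file://templates/stacker-role.yaml \
  --parameters ParameterKey=MasterAccountId,ParameterValue=<master account id> \
  --capabilities CAPABILITY_NAMED_IAM
```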

Once the roles have been created, update the `role_arn` values in [.aws/config] to match the roles that were just created.

```console
$ aws cloudformation describe-stacks \
  --profile <profile> \
  --stack-name <stack name> \
  --query 'Stacks[0].Outputs' --output text
StackerRole     arn:aws:iam::<account id>:role/Stacker
```

### GetSessionToken

For stacker to call `sts:AssumeRole` with the roles we've specified in [.aws/config], we'll need to pass it credentials via environment variables (see [`credential_source = Environment`](./.aws/config)) with appropriate permissions. Generally, the best way to do this is to obtain temporary credentials via the `sts:GetSessionToken` API while passing an MFA OTP.

Assuming you have an IAM user in your master account, you can get temporary credentials using the AWS CLI:

```console
$ aws sts get-session-token \
  --serial-number arn:aws:iam::<master account id>:mfa/<iam username> \
  --token-code <mfa otp>
```

At Remind, we like to use [aws-vault], which allows us to simplify this to:

```console
$ aws-vault exec default -- env
AWS_VAULT=default
AWS_DEFAULT_REGION=us-east-1
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=ASIAJ...ICSXSQ
AWS_SECRET_ACCESS_KEY=4oFx...LSNjpFq
AWS_SESSION_TOKEN=FQoDYXdzED...V6Wrdko2KjW1QU=
AWS_SECURITY_TOKEN=FQoDYXdzED...V6Wrdko2KjW1QU=
```

For the rest of this guide, I'll use `aws-vault` for simplicity.

**NOTE**: You'll need to ensure that this IAM user has access to call `sts:AssumeRole` on the `Stacker` IAM role in the "master" account.

### Bootstrap Stacker Bucket

After we have some IAM roles that stacker can assume, and some temporary credentials, we'll want to create a stacker bucket in the master account, and allow the Stacker roles in sub-accounts access to fetch templates from it.

To do that, first change the "Roles" variable in [stacker.yaml], then:

```console
$ aws-vault exec default # GetSessionToken + MFA
$ AWS_CONFIG_FILE=.aws/config stacker build --profile master --stacks stacker-bucket stacker.yaml
```

Once the bucket has been created, replace `stacker_bucket` with the name of the bucket in [stacker.yaml].

```console
$ aws cloudformation describe-stacks \
  --profile master \
  --stack-name stacker-bucket \
  --query 'Stacks[0].Outputs' --output text
BucketId     stacker-bucket-1234
```

### Provision stacks

Now that everything is set up, you can add new stacks to your config file and target them to a specific AWS account using the `profile` option. For example, if I wanted to create a new VPC in both the "production" and "staging" accounts:

```yaml
stacks:
  - name: prod/vpc
    stack_name: vpc
    class_path: stacker_blueprints.vpc.VPC
    profile: prod # target this to the production account
  - name: stage/vpc
    stack_name: vpc
    class_path: stacker_blueprints.vpc.VPC
    profile: stage # target this to the staging account
```

```console
$ AWS_CONFIG_FILE=.aws/config stacker build --profile master stacker.yaml
```

[.aws/config]: ./.aws/config
[stacker.yaml]: ./stacker.yaml
[aws-vault]: https://github.com/99designs/aws-vault


================================================
FILE: examples/cross-account/stacker.yaml
================================================
---
namespace: ''

# We'll set this to an empty string until we've provisioned the
# "stacker-bucket" stack below.
stacker_bucket: ''

stacks:
  # This stack will provision an S3 bucket for stacker to use to upload
  # templates. This will also configure the bucket with a bucket policy
  # allowing CloudFormation in other accounts to fetch templates from it.
  - name: stacker-bucket
    # We're going to "target" this stack in our "master" account.
    profile: master
    template_path: templates/stacker-bucket.yaml
    variables:
      # Change these to the correct AWS account IDs; this must be a comma-separated list
      Roles: arn:aws:iam::<prod account id>:role/Stacker, arn:aws:iam::<stage account id>:role/Stacker


================================================
FILE: examples/cross-account/templates/stacker-bucket.yaml
================================================
---
AWSTemplateFormatVersion: "2010-09-09"
Description: A bucket for stacker to store CloudFormation templates
Parameters:
  Roles:
    Type: CommaDelimitedList
    Description: A list of IAM roles that will be given read access on the bucket.

Resources:
  StackerBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketEncryption:
        ServerSideEncryptionConfiguration:
        - ServerSideEncryptionByDefault:
            SSEAlgorithm: AES256

  BucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      Bucket:
        Ref: StackerBucket
      PolicyDocument:
        Statement:
        - Action:
          - s3:GetObject
          Effect: Allow
          Principal:
            AWS:
              Ref: Roles
          Resource:
          - Fn::Sub: arn:aws:s3:::${StackerBucket}/*

Outputs:
  BucketId:
    Value:
      Ref: StackerBucket


================================================
FILE: examples/cross-account/templates/stacker-role.yaml
================================================
---
AWSTemplateFormatVersion: "2010-09-09"
Description: A role that stacker can assume
Parameters:
  MasterAccountId:
    Type: String
    Description: The 12-digit ID for the master account
    MinLength: 12
    MaxLength: 12
    AllowedPattern: "[0-9]+"
    ConstraintDescription: Must contain a 12 digit account ID
  RoleName:
    Type: String
    Description: The name of the stacker role.
    Default: Stacker


Conditions:
  # Check if we're creating this role in the master account.
  InMasterAccount:
    Fn::Equals:
      - { Ref: "AWS::AccountId" }
      - { Ref: "MasterAccountId" }

Resources:
  StackerRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName:
        Ref: RoleName
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          Fn::If:
            - InMasterAccount
            - Effect: Allow
              Principal:
                AWS:
                  Fn::Sub: "arn:aws:iam::${MasterAccountId}:root"
              Action: sts:AssumeRole
              Condition:
                'Null':
                  aws:MultiFactorAuthAge: false
            - Effect: Allow
              Principal:
                AWS:
                  Fn::Sub: "arn:aws:iam::${MasterAccountId}:role/${RoleName}"
              Action: sts:AssumeRole
              Condition:
                'Null':
                  aws:MultiFactorAuthAge: false

  # Generally, Stacker will need fairly wide open permissions, since it will be
  # managing all resources in an account.
  StackerPolicies:
    Type: AWS::IAM::Policy
    Properties:
      PolicyName: Stacker
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Action: ["*"]
            Resource: "*"
      Roles:
        - Ref: StackerRole

Outputs:
  StackerRole:
    Value:
      Fn::GetAtt:
        - StackerRole
        - Arn


================================================
FILE: requirements.in
================================================
troposphere>=3.0.0
botocore>=1.12.111
boto3>=1.9.111,<2.0
PyYAML>=3.13b1
awacs>=0.6.0
gitpython>=3.0
jinja2>=2.7
schematics>=2.1.0
formic2
python-dateutil>=2.0,<3.0
MarkupSafe>=2
more-itertools
rsa>=4.7
python-jose
future


================================================
FILE: scripts/compare_env
================================================
#!/usr/bin/env python
""" A script to compare environment files. """

import argparse
import os.path

from stacker.environment import parse_environment


def parse_args():
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        "-i", "--ignore-changed", action="store_true",
        help="Only print added & deleted keys, not changed keys.")
    parser.add_argument(
        "-s", "--show-changes", action="store_true",
        help="Print content changes.")
    parser.add_argument(
        "first_env", type=str,
        help="The first environment file to compare.")
    parser.add_argument(
        "second_env", type=str,
        help="The second environment file to compare.")

    return parser.parse_args()


def parse_env_file(path):
    expanded_path = os.path.expanduser(path)
    with open(expanded_path) as fd:
        return parse_environment(fd.read())


def main():
    args = parse_args()

    first_env = parse_env_file(args.first_env)
    second_env = parse_env_file(args.second_env)

    first_env_keys = set(first_env.keys())
    second_env_keys = set(second_env.keys())

    common_keys = first_env_keys & second_env_keys
    removed_keys = first_env_keys - second_env_keys
    added_keys = second_env_keys - first_env_keys

    changed_keys = set()

    for k in common_keys:
        if first_env[k] != second_env[k]:
            changed_keys.add(k)

    print("-- Added keys:")
    print("  %s" % ", ".join(added_keys))
    print()
    print("-- Removed keys:")
    print("  %s" % ", ".join(removed_keys))
    print()
    print("-- Changed keys:")
    if not args.show_changes:
        print("  %s" % ", ".join(changed_keys))
    if args.show_changes:
        for k in changed_keys:
            print("  %s:" % k)
            print("    < %s" % first_env[k])
            print("    > %s" % second_env[k])

if __name__ == "__main__":
    main()


================================================
FILE: scripts/docker-stacker
================================================
#!/bin/bash

# This script is meant to be used from within the Docker image for stacker. It
# simply installs the stacks at /stacks and then runs stacker.

set -e

cd /stacks
python setup.py install

exec stacker "$@"


================================================
FILE: scripts/stacker
================================================
#!/usr/bin/env python

from stacker.logger import setup_logging
from stacker.commands import Stacker

if __name__ == "__main__":
    stacker = Stacker(setup_logging=setup_logging)
    args = stacker.parse_args()
    stacker.configure(args)
    args.run(args)


================================================
FILE: scripts/stacker.cmd
================================================
@echo OFF
REM="""
setlocal
set PythonExe=""
set PythonExeFlags=

for %%i in (cmd bat exe) do (
    for %%j in (python.%%i) do (
        call :SetPythonExe "%%~$PATH:j"
    )
)
for /f "tokens=2 delims==" %%i in ('assoc .py') do (
    for /f "tokens=2 delims==" %%j in ('ftype %%i') do (
        for /f "tokens=1" %%k in ("%%j") do (
            call :SetPythonExe %%k
        )
    )
)
%PythonExe% -x %PythonExeFlags% "%~f0" %*
exit /B %ERRORLEVEL%
goto :EOF

:SetPythonExe
if not ["%~1"]==[""] (
    if [%PythonExe%]==[""] (
        set PythonExe="%~1"
    )
)
goto :EOF
"""

# ===================================================
# Python script starts here
# Above helper adapted from https://github.com/aws/aws-cli/blob/1.11.121/bin/aws.cmd
# ===================================================

#!/usr/bin/env python

from stacker.logger import setup_logging
from stacker.commands import Stacker

if __name__ == "__main__":
    stacker = Stacker(setup_logging=setup_logging)
    args = stacker.parse_args()
    stacker.configure(args)
    args.run(args)


================================================
FILE: setup.cfg
================================================
[metadata]
description-file = README.rst

[aliases]
test = pytest

[tool:pytest]
testpaths = stacker/tests
cov = stacker
filterwarnings =
  ignore::DeprecationWarning


================================================
FILE: setup.py
================================================
import os
from setuptools import setup, find_packages

VERSION = "1.7.2"

src_dir = os.path.dirname(__file__)

def get_install_requirements(path):
    content = open(os.path.join(os.path.dirname(__file__), path)).read()
    return [req for req in content.split("\n") if req != "" and not req.startswith("#")]

install_requires = get_install_requirements("requirements.in")

setup_requires = ['pytest-runner']

tests_require = get_install_requirements("test-requirements.in")

scripts = [
    "scripts/compare_env",
    "scripts/docker-stacker",
    "scripts/stacker.cmd",
    "scripts/stacker",
]


def read(filename):
    full_path = os.path.join(src_dir, filename)
    with open(full_path) as fd:
        return fd.read()


if __name__ == "__main__":
    setup(
        name="stacker",
        version=VERSION,
        author="Michael Barrett",
        author_email="loki77@gmail.com",
        license="New BSD license",
        url="https://github.com/cloudtools/stacker",
        description="AWS CloudFormation Stack manager",
        long_description=read("README.rst"),
        packages=find_packages(),
        scripts=scripts,
        install_requires=install_requires,
        tests_require=tests_require,
        setup_requires=setup_requires,
        extras_require=dict(testing=tests_require),
        classifiers=[
            "Development Status :: 5 - Production/Stable",
            "Environment :: Console",
            "License :: OSI Approved :: BSD License",
            "Programming Language :: Python :: 3.7",
            "Programming Language :: Python :: 3.8",
            "Programming Language :: Python :: 3.9",
            "Programming Language :: Python :: 3.10",
        ],
    )


================================================
FILE: stacker/__init__.py
================================================

__version__ = "1.7.2"


================================================
FILE: stacker/actions/__init__.py
================================================


================================================
FILE: stacker/actions/base.py
================================================
import os
import sys
import logging
import threading

from ..dag import walk, ThreadedWalker, UnlimitedSemaphore
from ..plan import Step, build_plan, build_graph

import botocore.exceptions
from stacker.session_cache import get_session
from stacker.exceptions import PlanFailed

from ..status import (
    COMPLETE
)

from stacker.util import (
    ensure_s3_bucket,
    get_s3_endpoint,
)

logger = logging.getLogger(__name__)

# After submitting a stack update/create, this controls how long we'll wait
# between calls to DescribeStacks to check on its status. Most stack updates
# take at least a couple of minutes, so 30 seconds is pretty reasonable and in line
# with the suggested value in
# https://github.com/boto/botocore/blob/1.6.1/botocore/data/cloudformation/2010-05-15/waiters-2.json#L22
#
# This can be controlled via an environment variable, mostly for testing.
STACK_POLL_TIME = int(os.environ.get("STACKER_STACK_POLL_TIME", 30))


def build_walker(concurrency):
    """This will return a function suitable for passing to
    :class:`stacker.plan.Plan` for walking the graph.

    If concurrency is 1 (no parallelism) this will return a simple topological
    walker that doesn't use any multithreading.

    If concurrency is 0, this will return a walker that will walk the graph as
    fast as the graph topology allows.

    If concurrency is greater than 1, it will return a walker that will only
    execute a maximum of concurrency steps at any given time.

    Returns:
        func: returns a function to walk a :class:`stacker.dag.DAG`.
    """
    if concurrency == 1:
        return walk

    semaphore = UnlimitedSemaphore()
    if concurrency > 1:
        semaphore = threading.Semaphore(concurrency)

    return ThreadedWalker(semaphore).walk


def plan(description, stack_action, context,
         tail=None, reverse=False):
    """A simple helper that builds a graph based plan from a set of stacks.

    Args:
        description (str): a description of the plan.
        action (func): a function to call for each stack.
        context (:class:`stacker.context.Context`): a
            :class:`stacker.context.Context` to build the plan from.
        tail (func): an optional function to call to tail the stack progress.
        reverse (bool): if True, execute the graph in reverse (useful for
            destroy actions).

    Returns:
        :class:`plan.Plan`: The resulting plan object
    """

    def target_fn(*args, **kwargs):
        return COMPLETE

    steps = [
        Step(stack, fn=stack_action, watch_func=tail)
        for stack in context.get_stacks()]

    steps += [
        Step(target, fn=target_fn) for target in context.get_targets()]

    graph = build_graph(steps)

    return build_plan(
        description=description,
        graph=graph,
        targets=context.stack_names,
        reverse=reverse)


def stack_template_key_name(blueprint):
    """Given a blueprint, produce an appropriate key name.

    Args:
        blueprint (:class:`stacker.blueprints.base.Blueprint`): The blueprint
            object to create the key from.

    Returns:
        string: Key name resulting from blueprint.
    """
    name = blueprint.name
    return "stack_templates/%s/%s-%s.json" % (blueprint.context.get_fqn(name),
                                              name,
                                              blueprint.version)


def stack_template_url(bucket_name, blueprint, endpoint):
    """Produces an s3 url for a given blueprint.

    Args:
        bucket_name (string): The name of the S3 bucket where the resulting
            templates are stored.
        blueprint (:class:`stacker.blueprints.base.Blueprint`): The blueprint
            object to create the URL to.
        endpoint (string): The s3 endpoint used for the bucket.

    Returns:
        string: S3 URL.
    """
    key_name = stack_template_key_name(blueprint)
    return "%s/%s/%s" % (endpoint, bucket_name, key_name)


class BaseAction(object):

    """Actions perform the actual work of each Command.

    Each action is tied to a :class:`stacker.commands.base.BaseCommand`, and
    is responsible for building the :class:`stacker.plan.Plan` that will be
    executed to perform that command.

    Args:
        context (:class:`stacker.context.Context`): The stacker context for
            the current run.
        provider_builder (:class:`stacker.providers.base.BaseProviderBuilder`,
            optional): An object that will build a provider that will be
            interacted with in order to perform the necessary actions.
    """

    def __init__(self, context, provider_builder=None, cancel=None):
        self.context = context
        self.provider_builder = provider_builder
        self.bucket_name = context.bucket_name
        self.cancel = cancel or threading.Event()
        self.bucket_region = context.config.stacker_bucket_region
        if not self.bucket_region and provider_builder:
            self.bucket_region = provider_builder.region
        self.s3_conn = get_session(self.bucket_region).client('s3')

    def ensure_cfn_bucket(self):
        """Ensures the CloudFormation template bucket exists, if configured."""
        if self.bucket_name:
            ensure_s3_bucket(self.s3_conn,
                             self.bucket_name,
                             self.bucket_region)

    def stack_template_url(self, blueprint):
        return stack_template_url(
            self.bucket_name, blueprint, get_s3_endpoint(self.s3_conn)
        )

    def s3_stack_push(self, blueprint, force=False):
        """Pushes the rendered blueprint's template to S3.

        Verifies that the template doesn't already exist in S3 before
        pushing.

        Returns the URL to the template in S3.
        """
        key_name = stack_template_key_name(blueprint)
        template_url = self.stack_template_url(blueprint)
        try:
            template_exists = self.s3_conn.head_object(
                Bucket=self.bucket_name, Key=key_name) is not None
        except botocore.exceptions.ClientError as e:
            if e.response['Error']['Code'] == '404':
                template_exists = False
            else:
                raise

        if template_exists and not force:
            logger.debug("Cloudformation template %s already exists.",
                         template_url)
            return template_url
        self.s3_conn.put_object(Bucket=self.bucket_name,
                                Key=key_name,
                                Body=blueprint.rendered,
                                ServerSideEncryption='AES256',
                                ACL='bucket-owner-full-control')
        logger.debug("Blueprint %s pushed to %s.", blueprint.name,
                     template_url)
        return template_url

    def execute(self, *args, **kwargs):
        try:
            self.pre_run(*args, **kwargs)
            self.run(*args, **kwargs)
            self.post_run(*args, **kwargs)
        except PlanFailed as e:
            logger.error(str(e))
            sys.exit(1)

    def pre_run(self, *args, **kwargs):
        pass

    def run(self, *args, **kwargs):
        raise NotImplementedError("Subclass must implement \"run\" method")

    def post_run(self, *args, **kwargs):
        pass

    def build_provider(self, stack):
        """Builds a :class:`stacker.providers.base.Provider` suitable for
        operating on the given :class:`stacker.Stack`."""
        return self.provider_builder.build(region=stack.region,
                                           profile=stack.profile)

    @property
    def provider(self):
        """Some actions need a generic provider using the default region (e.g.
        hooks)."""
        return self.provider_builder.build()

    def _tail_stack(self, stack, cancel, retries=0, **kwargs):
        provider = self.build_provider(stack)
        return provider.tail_stack(stack, cancel, retries, **kwargs)


================================================
FILE: stacker/actions/build.py
================================================
import logging

from .base import BaseAction, plan, build_walker
from .base import STACK_POLL_TIME

from ..providers.base import Template
from stacker.hooks import utils
from ..exceptions import (
    MissingParameterException,
    StackDidNotChange,
    StackDoesNotExist,
    CancelExecution,
)

from ..status import (
    NotSubmittedStatus,
    NotUpdatedStatus,
    DidNotChangeStatus,
    SubmittedStatus,
    CompleteStatus,
    FailedStatus,
    SkippedStatus,
    PENDING,
    WAITING,
    SUBMITTED,
    INTERRUPTED
)


logger = logging.getLogger(__name__)


def build_stack_tags(stack):
    """Builds a common set of tags to attach to a stack"""
    return [{'Key': t[0], 'Value': t[1]} for t in stack.tags.items()]


def should_update(stack):
    """Tests whether a stack should be submitted for updates to CF.

    Args:
        stack (:class:`stacker.stack.Stack`): The stack object to check.

    Returns:
        bool: If the stack should be updated, return True.

    """
    if stack.locked:
        if not stack.force:
            logger.debug("Stack %s locked and not in --force list. "
                         "Refusing to update.", stack.name)
            return False
        else:
            logger.debug("Stack %s locked, but is in --force "
                         "list.", stack.name)
    return True


def should_submit(stack):
    """Tests whether a stack should be submitted to CF for update/create

    Args:
        stack (:class:`stacker.stack.Stack`): The stack object to check.

    Returns:
        bool: If the stack should be submitted, return True.

    """
    if stack.enabled:
        return True

    logger.debug("Stack %s is not enabled.  Skipping.", stack.name)
    return False


def should_ensure_cfn_bucket(outline, dump):
    """Test whether access to the cloudformation template bucket is required

    Args:
        outline (bool): The outline action.
        dump (bool): The dump action.

    Returns:
        bool: If access to CF bucket is needed, return True.

    """
    return not outline and not dump


def _resolve_parameters(parameters, blueprint):
    """Resolves CloudFormation Parameters for a given blueprint.

    Given a list of parameters, handles:
        - discard any parameters that the blueprint does not use
        - discard any empty values
        - convert booleans to strings suitable for CloudFormation

    Args:
        parameters (dict): A dictionary of parameters provided by the
            stack definition
        blueprint (:class:`stacker.blueprint.base.Blueprint`): A Blueprint
            object that is having the parameters applied to it.

    Returns:
        dict: The resolved parameters.

    """
    params = {}
    param_defs = blueprint.get_parameter_definitions()

    for key, value in parameters.items():
        if key not in param_defs:
            logger.debug("Blueprint %s does not use parameter %s.",
                         blueprint.name, key)
            continue
        if value is None:
            logger.debug("Got None value for parameter %s, not submitting it "
                         "to cloudformation, default value should be used.",
                         key)
            continue
        if isinstance(value, bool):
            logger.debug("Converting parameter %s boolean \"%s\" to string.",
                         key, value)
            value = str(value).lower()
        params[key] = value
    return params


class UsePreviousParameterValue(object):
    """ A simple class used to indicate a Parameter should use its existing
    value.
    """
    pass


def _handle_missing_parameters(parameter_values, all_params, required_params,
                               existing_stack=None):
    """Handles any missing parameters.

    If an existing_stack is provided, look up missing parameters there.

    Args:
        parameter_values (dict): key/value dictionary of stack definition
            parameters
        all_params (list): A list of all the parameters used by the
            template/blueprint.
        required_params (list): A list of all the parameters required by the
            template/blueprint.
        existing_stack (dict): A dict representation of the stack. If
            provided, will be searched for any missing parameters.

    Returns:
        list of tuples: The final list of key/value pairs returned as a
            list of tuples.

    Raises:
        MissingParameterException: Raised if a required parameter is
            still missing.

    """
    missing_params = list(set(all_params) - set(parameter_values.keys()))
    if existing_stack and 'Parameters' in existing_stack:
        stack_parameters = [
            p["ParameterKey"] for p in existing_stack["Parameters"]
        ]
        for p in missing_params:
            if p in stack_parameters:
                logger.debug(
                    "Using previous value for parameter %s from existing "
                    "stack",
                    p
                )
                parameter_values[p] = UsePreviousParameterValue
    final_missing = list(set(required_params) - set(parameter_values.keys()))
    if final_missing:
        raise MissingParameterException(final_missing)

    return list(parameter_values.items())


def handle_hooks(stage, hooks, provider, context, dump, outline):
    """Handle pre/post hooks.

    Args:
        stage (str): The name of the hook stage - pre_build/post_build.
        hooks (list): A list of dictionaries containing the hooks to execute.
        provider (:class:`stacker.provider.base.BaseProvider`): The provider
            the current stack is using.
        context (:class:`stacker.context.Context`): The current stacker
            context.
        dump (bool): Whether running with dump set or not.
        outline (bool): Whether running with outline set or not.

    """
    if not outline and not dump and hooks:
        utils.handle_hooks(
            stage=stage,
            hooks=hooks,
            provider=provider,
            context=context
        )


class Action(BaseAction):
    """Responsible for building & coordinating CloudFormation stacks.

    Generates the build plan based on stack dependencies (these dependencies
    are determined automatically based on output lookups from other stacks).

    The plan can then either be printed out as an outline or executed. If
    executed, each stack will get launched in order which entails:

        - Pushing the generated CloudFormation template to S3 if it has changed
        - Submitting either a build or update of the given stack to the
            :class:`stacker.provider.base.Provider`.

    """

    def build_parameters(self, stack, provider_stack=None):
        """Builds the CloudFormation Parameters for our stack.

        Args:
            stack (:class:`stacker.stack.Stack`): A stacker stack
            provider_stack (dict): An optional Stacker provider object

        Returns:
            dict: The parameters for the given stack

        """
        resolved = _resolve_parameters(stack.parameter_values, stack.blueprint)
        required_parameters = list(stack.required_parameter_definitions)
        all_parameters = list(stack.all_parameter_definitions)
        parameters = _handle_missing_parameters(resolved, all_parameters,
                                                required_parameters,
                                                provider_stack)

        param_list = []

        for key, value in parameters:
            param_dict = {"ParameterKey": key}
            if value is UsePreviousParameterValue:
                param_dict["UsePreviousValue"] = True
            else:
                param_dict["ParameterValue"] = str(value)

            param_list.append(param_dict)

        return param_list

    def _launch_stack(self, stack, **kwargs):
        """Handles the creating or updating of a stack in CloudFormation.

        Also makes sure that we don't try to create or update a stack while
        it is already updating or creating.

        """
        old_status = kwargs.get("status")
        wait_time = 0 if old_status is PENDING else STACK_POLL_TIME
        if self.cancel.wait(wait_time):
            return INTERRUPTED

        if not should_submit(stack):
            return NotSubmittedStatus()

        provider = self.build_provider(stack)

        try:
            provider_stack = provider.get_stack(stack.fqn)
        except StackDoesNotExist:
            provider_stack = None

        if provider_stack and not should_update(stack):
            stack.set_outputs(
                self.provider.get_output_dict(provider_stack))
            return NotUpdatedStatus()

        recreate = False
        if provider_stack and old_status == SUBMITTED:
            logger.debug(
                "Stack %s provider status: %s",
                stack.fqn,
                provider.get_stack_status(provider_stack),
            )

            if provider.is_stack_rolling_back(provider_stack):
                if 'rolling back' in old_status.reason:
                    return old_status

                logger.debug("Stack %s entered a roll back", stack.fqn)
                if 'updating' in old_status.reason:
                    reason = 'rolling back update'
                else:
                    reason = 'rolling back new stack'

                return SubmittedStatus(reason)
            elif provider.is_stack_in_progress(provider_stack):
                logger.debug("Stack %s in progress.", stack.fqn)
                return old_status
            elif provider.is_stack_destroyed(provider_stack):
                logger.debug("Stack %s finished deleting", stack.fqn)
                recreate = True
                # Continue with creation afterwards
            # Failure must be checked *before* completion, as both will be true
            # when completing a rollback, and we don't want to consider it as
            # a successful update.
            elif provider.is_stack_failed(provider_stack):
                reason = old_status.reason
                if 'rolling' in reason:
                    reason = reason.replace('rolling', 'rolled')
                status_reason = provider.get_rollback_status_reason(stack.fqn)
                logger.info(
                    "%s Stack Roll Back Reason: %s", stack.fqn, status_reason)
                return FailedStatus(reason)

            elif provider.is_stack_completed(provider_stack):
                stack.set_outputs(
                    provider.get_output_dict(provider_stack))
                return CompleteStatus(old_status.reason)
            else:
                return old_status

        logger.debug("Resolving stack %s", stack.fqn)
        stack.resolve(self.context, self.provider)

        logger.debug("Launching stack %s now.", stack.fqn)
        template = self._template(stack.blueprint)
        stack_policy = self._stack_policy(stack)
        tags = build_stack_tags(stack)
        parameters = self.build_parameters(stack, provider_stack)
        force_change_set = stack.blueprint.requires_change_set

        if recreate:
            logger.debug("Re-creating stack: %s", stack.fqn)
            provider.create_stack(stack.fqn, template, parameters,
                                  tags, stack_policy=stack_policy)
            return SubmittedStatus("re-creating stack")
        elif not provider_stack:
            logger.debug("Creating new stack: %s", stack.fqn)
            provider.create_stack(stack.fqn, template, parameters, tags,
                                  force_change_set,
                                  stack_policy=stack_policy,
                                  notification_arns=stack.notification_arns)
            return SubmittedStatus("creating new stack")

        try:
            wait = stack.in_progress_behavior == "wait"
            if wait and provider.is_stack_in_progress(provider_stack):
                return WAITING
            if provider.prepare_stack_for_update(provider_stack, tags):
                existing_params = provider_stack.get('Parameters', [])
                provider.update_stack(
                    stack.fqn,
                    template,
                    existing_params,
                    parameters,
                    tags,
                    force_interactive=stack.protected,
                    force_change_set=force_change_set,
                    stack_policy=stack_policy,
                    notification_arns=stack.notification_arns
                )

                logger.debug("Updating existing stack: %s", stack.fqn)
                return SubmittedStatus("updating existing stack")
            else:
                return SubmittedStatus("destroying stack for re-creation")
        except CancelExecution:
            stack.set_outputs(provider.get_output_dict(provider_stack))
            return SkippedStatus(reason="canceled execution")
        except StackDidNotChange:
            stack.set_outputs(provider.get_output_dict(provider_stack))
            return DidNotChangeStatus()

    def _template(self, blueprint):
        """Generates a suitable template based on whether or not an S3 bucket
        is set.

        If an S3 bucket is set, then the template will be uploaded to S3 first,
        and CreateStack/UpdateStack operations will use the uploaded template.
        If no bucket is set, then the template will be inlined.
        """
        if self.bucket_name:
            return Template(url=self.s3_stack_push(blueprint))
        else:
            return Template(body=blueprint.rendered)

    def _stack_policy(self, stack):
        """Returns a Template object for the stacks stack policy, or None if
        the stack doesn't have a stack policy."""
        if stack.stack_policy:
            return Template(body=stack.stack_policy)

    def _generate_plan(self, tail=False):
        return plan(
            description="Create/Update stacks",
            stack_action=self._launch_stack,
            tail=self._tail_stack if tail else None,
            context=self.context)

    def pre_run(self, outline=False, dump=False, *args, **kwargs):
        """Any steps that need to be taken prior to running the action."""
        if should_ensure_cfn_bucket(outline, dump):
            self.ensure_cfn_bucket()
        hooks = self.context.config.pre_build
        handle_hooks(
            "pre_build",
            hooks,
            self.provider,
            self.context,
            dump,
            outline
        )

    def run(self, concurrency=0, outline=False,
            tail=False, dump=False, *args, **kwargs):
        """Kicks off the build/update of the stacks in the stack_definitions.

        This is the main entry point for the Builder.

        """
        plan = self._generate_plan(tail=tail)
        if not plan.keys():
            logger.warning('WARNING: No stacks detected (error in config?)')
        if not outline and not dump:
            plan.outline(logging.DEBUG)
            logger.debug("Launching stacks: %s", ", ".join(plan.keys()))
            walker = build_walker(concurrency)
            plan.execute(walker)
        else:
            if outline:
                plan.outline()
            if dump:
                plan.dump(directory=dump, context=self.context,
                          provider=self.provider)

    def post_run(self, outline=False, dump=False, *args, **kwargs):
        """Any steps that need to be taken after running the action."""
        hooks = self.context.config.post_build
        handle_hooks(
            "post_build",
            hooks,
            self.provider,
            self.context,
            dump,
            outline
        )


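The parameter flow above — `_handle_missing_parameters` falling back to the previous value, then `build_parameters` converting the result into CloudFormation `Parameter` dicts — can be sketched as one standalone function. This is an illustrative reimplementation, not the stacker API; the name `build_parameter_list` and the inline `UsePreviousParameterValue` sentinel are recreated here for the sketch:

```python
# Standalone sketch of stacker's parameter handling, for illustration only.

class UsePreviousParameterValue(object):
    """Sentinel: reuse the value already set on the deployed stack."""


def build_parameter_list(resolved, existing_keys, required):
    """Turn resolved config values into CloudFormation Parameter dicts.

    resolved: dict of parameter name -> value from the config.
    existing_keys: parameter names present on the deployed stack.
    required: names that must get a value from somewhere.
    """
    values = dict(resolved)
    # A required parameter missing from the config, but present on the
    # existing stack, reuses its previous value instead of failing.
    for name in required:
        if name not in values and name in existing_keys:
            values[name] = UsePreviousParameterValue

    still_missing = set(required) - set(values)
    if still_missing:
        raise ValueError("Missing parameters: %s" % sorted(still_missing))

    param_list = []
    for key, value in values.items():
        entry = {"ParameterKey": key}
        if value is UsePreviousParameterValue:
            entry["UsePreviousValue"] = True
        else:
            entry["ParameterValue"] = str(value)
        param_list.append(entry)
    return param_list
```

For example, with `resolved={"Env": "prod"}`, `existing_keys={"InstanceType"}`, and `required=["Env", "InstanceType"]`, the result contains `{"ParameterKey": "InstanceType", "UsePreviousValue": True}` alongside the explicit `Env` value.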
================================================
FILE: stacker/actions/destroy.py
================================================
import logging

from .base import BaseAction, plan, build_walker
from .base import STACK_POLL_TIME
from ..exceptions import StackDoesNotExist
from stacker.hooks.utils import handle_hooks
from ..status import (
    CompleteStatus,
    SubmittedStatus,
    PENDING,
    SUBMITTED,
    INTERRUPTED
)

from ..status import StackDoesNotExist as StackDoesNotExistStatus

logger = logging.getLogger(__name__)

DestroyedStatus = CompleteStatus("stack destroyed")
DestroyingStatus = SubmittedStatus("submitted for destruction")


class Action(BaseAction):
    """Responsible for destroying CloudFormation stacks.

    Generates a destruction plan based on stack dependencies. Stack
    dependencies are reversed from the build action. For example, if a Stack B
    requires Stack A during build, during destroy Stack A requires Stack B be
    destroyed first.

    The plan defaults to printing an outline of what will be destroyed. If
    forced to execute, each stack will get destroyed in order.

    """

    def _generate_plan(self, tail=False):
        return plan(
            description="Destroy stacks",
            stack_action=self._destroy_stack,
            tail=self._tail_stack if tail else None,
            context=self.context,
            reverse=True)

    def _destroy_stack(self, stack, **kwargs):
        old_status = kwargs.get("status")
        wait_time = 0 if old_status is PENDING else STACK_POLL_TIME
        if self.cancel.wait(wait_time):
            return INTERRUPTED

        provider = self.build_provider(stack)

        try:
            provider_stack = provider.get_stack(stack.fqn)
        except StackDoesNotExist:
            logger.debug("Stack %s does not exist.", stack.fqn)
            # Once the stack has been destroyed, it doesn't exist. If the
            # status of the step was SUBMITTED, we know we just deleted it,
            # otherwise it should be skipped
            if kwargs.get("status", None) == SUBMITTED:
                return DestroyedStatus
            else:
                return StackDoesNotExistStatus()

        logger.debug(
            "Stack %s provider status: %s",
            provider.get_stack_name(provider_stack),
            provider.get_stack_status(provider_stack),
        )
        if provider.is_stack_destroyed(provider_stack):
            return DestroyedStatus
        elif provider.is_stack_in_progress(provider_stack):
            return DestroyingStatus
        else:
            logger.debug("Destroying stack: %s", stack.fqn)
            provider.destroy_stack(provider_stack)
        return DestroyingStatus

    def pre_run(self, outline=False, *args, **kwargs):
        """Any steps that need to be taken prior to running the action."""
        pre_destroy = self.context.config.pre_destroy
        if not outline and pre_destroy:
            handle_hooks(
                stage="pre_destroy",
                hooks=pre_destroy,
                provider=self.provider,
                context=self.context)

    def run(self, force, concurrency=0, tail=False, *args, **kwargs):
        plan = self._generate_plan(tail=tail)
        if not plan.keys():
            logger.warning('WARNING: No stacks detected (error in config?)')
        if force:
            # need to generate a new plan to log since the outline sets the
            # steps to COMPLETE in order to log them
            plan.outline(logging.DEBUG)
            walker = build_walker(concurrency)
            plan.execute(walker)
        else:
            plan.outline(message="To execute this plan, run with \"--force\" "
                                 "flag.")

    def post_run(self, outline=False, *args, **kwargs):
        """Any steps that need to be taken after running the action."""
        post_destroy = self.context.config.post_destroy
        if not outline and post_destroy:
            handle_hooks(
                stage="post_destroy",
                hooks=post_destroy,
                provider=self.provider,
                context=self.context)


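The reversed dependency ordering described in the destroy `Action` docstring can be illustrated with a toy graph: if Stack B requires Stack A at build time, destroy must delete B before A. A minimal sketch using plain dicts (not stacker's DAG/plan classes; `build_order` and `destroy_order` are invented names):

```python
# Toy illustration of build vs. destroy ordering; not stacker's plan API.

def build_order(requires):
    """Topologically sort stacks so dependencies come first.

    requires: dict mapping stack name -> set of stacks it depends on.
    """
    order, seen = [], set()

    def visit(name):
        if name in seen:
            return
        seen.add(name)
        for dep in sorted(requires.get(name, ())):
            visit(dep)
        order.append(name)

    for name in sorted(requires):
        visit(name)
    return order


def destroy_order(requires):
    # Destroy is simply the build order reversed: dependents go first.
    return list(reversed(build_order(requires)))
```

Here `build_order({"A": set(), "B": {"A"}})` yields `["A", "B"]`, so the destroy plan runs `["B", "A"]` — B is torn down before the stack it depended on.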
================================================
FILE: stacker/actions/diff.py
================================================
import logging
from operator import attrgetter

from .base import plan, build_walker
from . import build
from .. import exceptions
from ..status import (
    NotSubmittedStatus,
    NotUpdatedStatus,
    COMPLETE,
    INTERRUPTED,
)

logger = logging.getLogger(__name__)


class DictValue(object):
    ADDED = "ADDED"
    REMOVED = "REMOVED"
    MODIFIED = "MODIFIED"
    UNMODIFIED = "UNMODIFIED"

    formatter = "%s%s = %s"

    def __init__(self, key, old_value, new_value):
        self.key = key
        self.old_value = old_value
        self.new_value = new_value

    def __eq__(self, other):
        return self.__dict__ == other.__dict__

    def changes(self):
        """Returns a list of changes to represent the diff between
        old and new value.

        Returns:
            list: [string] representation of the change (if any)
                between old and new value
        """
        output = []
        if self.status() is self.UNMODIFIED:
            output = [self.formatter % (' ', self.key, self.old_value)]
        elif self.status() is self.ADDED:
            output.append(self.formatter % ('+', self.key, self.new_value))
        elif self.status() is self.REMOVED:
            output.append(self.formatter % ('-', self.key, self.old_value))
        elif self.status() is self.MODIFIED:
            output.append(self.formatter % ('-', self.key, self.old_value))
            output.append(self.formatter % ('+', self.key, self.new_value))
        return output

    def status(self):
        if self.old_value == self.new_value:
            return self.UNMODIFIED
        elif self.old_value is None:
            return self.ADDED
        elif self.new_value is None:
            return self.REMOVED
        else:
            return self.MODIFIED
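The four statuses above drive the `+`/`-` lines in the rendered diff. A condensed, standalone copy of that logic (plain functions rather than the `DictValue` class, for demonstration only):

```python
# Standalone demonstration of DictValue's status/changes semantics.

def status(old, new):
    """Classify a key's change between old and new parameter sets."""
    if old == new:
        return "UNMODIFIED"
    if old is None:
        return "ADDED"
    if new is None:
        return "REMOVED"
    return "MODIFIED"


def changes(key, old, new):
    """Render the diff lines for one key, mirroring DictValue.changes()."""
    fmt = "%s%s = %s"
    s = status(old, new)
    if s == "UNMODIFIED":
        return [fmt % (" ", key, old)]
    if s == "ADDED":
        return [fmt % ("+", key, new)]
    if s == "REMOVED":
        return [fmt % ("-", key, old)]
    # MODIFIED: show the old value removed and the new value added.
    return [fmt % ("-", key, old), fmt % ("+", key, new)]
```

For instance, `changes("Size", "m3.medium", "m4.large")` renders `['-Size = m3.medium', '+Size = m4.large']`.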


def diff_dictionaries(old_dict, new_dict):
    """Diffs two single dimension dictionaries

    Returns the number of changes and an unordered list
    expressing the common entries and changes.

    Args:
        old_dict(dict): old dictionary
        new_dict(dict): new dictionary

    Returns: list()
        int: number of changed records
        list: [DictValue]
    """

    old_set = set(old_dict)
    new_set = set(new_dict)

    added_set = new_set - old_set
    removed_set = old_set - new_set
    common_set = old_set & new_set

    changes = 0
    output = []
    for key in added_set:
        changes += 1
        output.append(DictValue(key, None, new_dict[key]))

    for key in removed_set:
        changes += 1
        output.append(DictValue(key, old_dict[key], None))

    for key in common_set:
        output.append(DictValue(key, old_dict[key], new_dict[key]))
        if str(old_dict[key]) != str(new_dict[key]):
            changes += 1

    output.sort(key=attrgetter("key"))
    return [changes, output]
SYMBOL INDEX (1202 symbols across 100 files)

FILE: setup.py
  function get_install_requirements (line 8) | def get_install_requirements(path):
  function read (line 26) | def read(filename):

FILE: stacker/actions/base.py
  function build_walker (line 34) | def build_walker(concurrency):
  function plan (line 60) | def plan(description, stack_action, context,
  function stack_template_key_name (line 96) | def stack_template_key_name(blueprint):
  function stack_template_url (line 112) | def stack_template_url(bucket_name, blueprint, endpoint):
  class BaseAction (line 129) | class BaseAction(object):
    method __init__ (line 145) | def __init__(self, context, provider_builder=None, cancel=None):
    method ensure_cfn_bucket (line 155) | def ensure_cfn_bucket(self):
    method stack_template_url (line 162) | def stack_template_url(self, blueprint):
    method s3_stack_push (line 167) | def s3_stack_push(self, blueprint, force=False):
    method execute (line 199) | def execute(self, *args, **kwargs):
    method pre_run (line 208) | def pre_run(self, *args, **kwargs):
    method run (line 211) | def run(self, *args, **kwargs):
    method post_run (line 214) | def post_run(self, *args, **kwargs):
    method build_provider (line 217) | def build_provider(self, stack):
    method provider (line 224) | def provider(self):
    method _tail_stack (line 229) | def _tail_stack(self, stack, cancel, retries=0, **kwargs):

FILE: stacker/actions/build.py
  function build_stack_tags (line 33) | def build_stack_tags(stack):
  function should_update (line 38) | def should_update(stack):
  function should_submit (line 59) | def should_submit(stack):
  function should_ensure_cfn_bucket (line 76) | def should_ensure_cfn_bucket(outline, dump):
  function _resolve_parameters (line 90) | def _resolve_parameters(parameters, blueprint):
  class UsePreviousParameterValue (line 129) | class UsePreviousParameterValue(object):
  function _handle_missing_parameters (line 136) | def _handle_missing_parameters(parameter_values, all_params, required_pa...
  function handle_hooks (line 181) | def handle_hooks(stage, hooks, provider, context, dump, outline):
  class Action (line 204) | class Action(BaseAction):
    method build_parameters (line 219) | def build_parameters(self, stack, provider_stack=None):
    method _launch_stack (line 250) | def _launch_stack(self, stack, **kwargs):
    method _template (line 374) | def _template(self, blueprint):
    method _stack_policy (line 387) | def _stack_policy(self, stack):
    method _generate_plan (line 393) | def _generate_plan(self, tail=False):
    method pre_run (line 400) | def pre_run(self, outline=False, dump=False, *args, **kwargs):
    method run (line 414) | def run(self, concurrency=0, outline=False,
    method post_run (line 436) | def post_run(self, outline=False, dump=False, *args, **kwargs):

FILE: stacker/actions/destroy.py
  class Action (line 23) | class Action(BaseAction):
    method _generate_plan (line 36) | def _generate_plan(self, tail=False):
    method _destroy_stack (line 44) | def _destroy_stack(self, stack, **kwargs):
    method pre_run (line 78) | def pre_run(self, outline=False, *args, **kwargs):
    method run (line 88) | def run(self, force, concurrency=0, tail=False, *args, **kwargs):
    method post_run (line 102) | def post_run(self, outline=False, *args, **kwargs):

FILE: stacker/actions/diff.py
  class DictValue (line 17) | class DictValue(object):
    method __init__ (line 25) | def __init__(self, key, old_value, new_value):
    method __eq__ (line 30) | def __eq__(self, other):
    method changes (line 33) | def changes(self):
    method status (line 53) | def status(self):
  function diff_dictionaries (line 64) | def diff_dictionaries(old_dict, new_dict):
  function format_params_diff (line 105) | def format_params_diff(parameter_diff):
  function diff_parameters (line 124) | def diff_parameters(old_params, new_params):
  class Action (line 142) | class Action(build.Action):
    method _diff_stack (line 153) | def _diff_stack(self, stack, **kwargs):
    method _generate_plan (line 183) | def _generate_plan(self):
    method run (line 189) | def run(self, concurrency=0, *args, **kwargs):
    method pre_run (line 201) | def pre_run(self, *args, **kwargs):
    method post_run (line 204) | def post_run(self, *args, **kwargs):

FILE: stacker/actions/graph.py
  function each_step (line 11) | def each_step(graph):
  function dot_format (line 24) | def dot_format(out, graph, name="digraph"):
  function json_format (line 35) | def json_format(out, graph):
  class Action (line 52) | class Action(BaseAction):
    method _generate_plan (line 54) | def _generate_plan(self):
    method run (line 60) | def run(self, format=None, reduce=False, *args, **kwargs):

FILE: stacker/actions/info.py
  class Action (line 9) | class Action(BaseAction):
    method run (line 16) | def run(self, *args, **kwargs):

FILE: stacker/awscli_yamlhelper.py
  function intrinsics_multi_constructor (line 20) | def intrinsics_multi_constructor(loader, tag_prefix, node):
  function yaml_dump (line 57) | def yaml_dump(dict_to_dump):
  function yaml_parse (line 66) | def yaml_parse(yamlstr):

FILE: stacker/blueprints/base.py
  class CFNParameter (line 45) | class CFNParameter(object):
    method __init__ (line 47) | def __init__(self, name, value):
    method __repr__ (line 81) | def __repr__(self):
    method to_parameter_value (line 84) | def to_parameter_value(self):
    method ref (line 89) | def ref(self):
  function build_parameter (line 93) | def build_parameter(name, properties):
  function validate_variable_type (line 112) | def validate_variable_type(var_name, var_type, value):
  function validate_allowed_values (line 148) | def validate_allowed_values(allowed_values, value):
  function resolve_variable (line 168) | def resolve_variable(var_name, var_def, provided_variable, blueprint_name):
  function parse_user_data (line 234) | def parse_user_data(variables, raw_user_data, blueprint_name):
  class Blueprint (line 285) | class Blueprint(object):
    method __init__ (line 298) | def __init__(self, name, context, mappings=None, description=None):
    method get_parameter_definitions (line 316) | def get_parameter_definitions(self):
    method get_output_definitions (line 337) | def get_output_definitions(self):
    method get_required_parameter_definitions (line 349) | def get_required_parameter_definitions(self):
    method get_parameter_values (line 364) | def get_parameter_values(self):
    method setup_parameters (line 383) | def setup_parameters(self):
    method defined_variables (line 396) | def defined_variables(self):
    method get_variables (line 408) | def get_variables(self):
    method get_cfn_parameters (line 425) | def get_cfn_parameters(self):
    method resolve_variables (line 440) | def resolve_variables(self, provided_variables):
    method import_mappings (line 463) | def import_mappings(self):
    method reset_template (line 471) | def reset_template(self):
    method render_template (line 476) | def render_template(self):
    method to_json (line 487) | def to_json(self, variables=None):
    method read_user_data (line 512) | def read_user_data(self, user_data_path):
    method set_template_description (line 529) | def set_template_description(self, description):
    method add_output (line 539) | def add_output(self, name, value):
    method requires_change_set (line 549) | def requires_change_set(self):
    method rendered (line 554) | def rendered(self):
    method version (line 560) | def version(self):
    method create_template (line 565) | def create_template(self):

FILE: stacker/blueprints/raw.py
  function get_template_path (line 15) | def get_template_path(filename):
  function get_template_params (line 38) | def get_template_params(template):
  function resolve_variable (line 55) | def resolve_variable(provided_variable, blueprint_name):
  class RawTemplateBlueprint (line 85) | class RawTemplateBlueprint(Blueprint):
    method __init__ (line 88) | def __init__(self, name, context, raw_template_path, mappings=None, # ...
    method to_json (line 99) | def to_json(self, variables=None):  # pylint: disable=unused-argument
    method to_dict (line 113) | def to_dict(self):
    method render_template (line 122) | def render_template(self):
    method get_parameter_definitions (line 126) | def get_parameter_definitions(self):
    method get_output_definitions (line 137) | def get_output_definitions(self):
    method resolve_variables (line 148) | def resolve_variables(self, provided_variables):
    method get_parameter_values (line 185) | def get_parameter_values(self):
    method requires_change_set (line 197) | def requires_change_set(self):
    method rendered (line 202) | def rendered(self):
    method version (line 226) | def version(self):

FILE: stacker/blueprints/testutil.py
  function diff (line 13) | def diff(a, b):
  class BlueprintTestCase (line 25) | class BlueprintTestCase(unittest.TestCase):
    method assertRenderedBlueprint (line 28) | def assertRenderedBlueprint(self, blueprint):  # noqa: N802
  class YamlDirTestGenerator (line 45) | class YamlDirTestGenerator(object):
    method __init__ (line 91) | def __init__(self):
    method base_class (line 99) | def base_class(self):
    method yaml_dirs (line 103) | def yaml_dirs(self):
    method yaml_filename (line 107) | def yaml_filename(self):
    method test_generator (line 110) | def test_generator(self):

FILE: stacker/blueprints/variables/types.py
  class TroposphereType (line 3) | class TroposphereType(object):
    method __init__ (line 5) | def __init__(self, defined_type, many=False, optional=False,
    method _validate_type (line 47) | def _validate_type(self, defined_type):
    method resource_name (line 52) | def resource_name(self):
    method create (line 57) | def create(self, value):
  class CFNType (line 109) | class CFNType(object):
    method __init__ (line 111) | def __init__(self, parameter_type):

FILE: stacker/commands/stacker/__init__.py
  class Stacker (line 18) | class Stacker(BaseCommand):
    method configure (line 23) | def configure(self, options, **kwargs):
    method add_arguments (line 55) | def add_arguments(self, parser):

FILE: stacker/commands/stacker/base.py
  function cancel (line 22) | def cancel():
  class KeyValueAction (line 39) | class KeyValueAction(argparse.Action):
    method __init__ (line 40) | def __init__(self, option_strings, dest, default=None, nargs=None,
    method __call__ (line 48) | def __call__(self, parser, namespace, values, option_string=None):
  function key_value_arg (line 56) | def key_value_arg(string):
  function environment_file (line 65) | def environment_file(input_file):
  class BaseCommand (line 77) | class BaseCommand(object):
    method __init__ (line 98) | def __init__(self, setup_logging=None, *args, **kwargs):
    method add_subcommands (line 103) | def add_subcommands(self, parser):
    method parse_args (line 117) | def parse_args(self, *vargs):
    method run (line 125) | def run(self, options, **kwargs):
    method configure (line 128) | def configure(self, options, **kwargs):
    method get_context_kwargs (line 132) | def get_context_kwargs(self, options, **kwargs):
    method add_arguments (line 149) | def add_arguments(self, parser):
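
The `KeyValueAction` and `key_value_arg` helpers listed above are the argparse plumbing that turns repeated `KEY=value` command-line arguments into a single dict. A minimal self-contained sketch of that pattern (illustrative, not stacker's exact implementation; the `--env` flag name is invented):

```python
import argparse


class KeyValueAction(argparse.Action):
    """Collect repeated KEY=value arguments into a single dict."""

    def __call__(self, parser, namespace, values, option_string=None):
        dest = getattr(namespace, self.dest, None) or {}
        dest.update(values)  # `values` is already a one-entry dict (see below)
        setattr(namespace, self.dest, dest)


def key_value_arg(string):
    """Parse a single 'KEY=value' token, raising on malformed input."""
    try:
        key, value = string.split("=", 1)
    except ValueError:
        raise argparse.ArgumentTypeError("%r is not in KEY=value form" % string)
    return {key: value}


parser = argparse.ArgumentParser()
parser.add_argument("-e", "--env", action=KeyValueAction,
                    type=key_value_arg, default={})
args = parser.parse_args(["-e", "region=us-east-1", "-e", "stage=prod"])
```

Using a `type` callable to validate each token and an `Action` to merge them keeps parse errors attached to the offending argument rather than surfacing later as a malformed dict.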

FILE: stacker/commands/stacker/build.py
  class Build (line 13) | class Build(BaseCommand):
    method add_arguments (line 18) | def add_arguments(self, parser):
    method run (line 47) | def run(self, options, **kwargs):
    method get_context_kwargs (line 57) | def get_context_kwargs(self, options, **kwargs):

FILE: stacker/commands/stacker/destroy.py
  class Destroy (line 12) | class Destroy(BaseCommand):
    method add_arguments (line 17) | def add_arguments(self, parser):
    method run (line 38) | def run(self, options, **kwargs):
    method get_context_kwargs (line 47) | def get_context_kwargs(self, options, **kwargs):

FILE: stacker/commands/stacker/diff.py
  class Diff (line 11) | class Diff(BaseCommand):
    method add_arguments (line 15) | def add_arguments(self, parser):
    method run (line 29) | def run(self, options, **kwargs):
    method get_context_kwargs (line 35) | def get_context_kwargs(self, options, **kwargs):

FILE: stacker/commands/stacker/graph.py
  class Graph (line 9) | class Graph(BaseCommand):
    method add_arguments (line 14) | def add_arguments(self, parser):
    method run (line 26) | def run(self, options, **kwargs):

FILE: stacker/commands/stacker/info.py
  class Info (line 7) | class Info(BaseCommand):
    method add_arguments (line 12) | def add_arguments(self, parser):
    method run (line 21) | def run(self, options, **kwargs):
    method get_context_kwargs (line 28) | def get_context_kwargs(self, options, **kwargs):

FILE: stacker/config/__init__.py
  function render_parse_load (line 39) | def render_parse_load(raw_config, environment=None, validate=True):
  function render (line 78) | def render(raw_config, environment=None):
  function substitute_references (line 151) | def substitute_references(root, environment, exp, full_exp):
  function parse (line 217) | def parse(raw_config):
  function load (line 254) | def load(config):
  function dump (line 277) | def dump(config):
  function process_remote_sources (line 296) | def process_remote_sources(raw_config, environment=None):
  function not_empty_list (line 330) | def not_empty_list(value):
  class AnyType (line 336) | class AnyType(BaseType):
  class LocalPackageSource (line 340) | class LocalPackageSource(Model):
  class GitPackageSource (line 348) | class GitPackageSource(Model):
  class S3PackageSource (line 362) | class S3PackageSource(Model):
  class PackageSources (line 376) | class PackageSources(Model):
  class Hook (line 384) | class Hook(Model):
  class Target (line 396) | class Target(Model):
  class Stack (line 404) | class Stack(Model):
    method validate_class_path (line 442) | def validate_class_path(self, data, value):
    method validate_template_path (line 449) | def validate_template_path(self, data, value):
    method validate_stack_source (line 456) | def validate_stack_source(self, data):
    method validate_parameters (line 466) | def validate_parameters(self, data, value):
  class Config (line 479) | class Config(Model):
    method _remove_excess_keys (line 543) | def _remove_excess_keys(self, data):
    method _convert (line 557) | def _convert(self, raw_data=None, context=None, **kwargs):
    method validate (line 568) | def validate(self, *args, **kwargs):
    method validate_stacks (line 576) | def validate_stacks(self, data, value):

FILE: stacker/config/translators/kms.py
  function kms_simple_constructor (line 5) | def kms_simple_constructor(loader, node):

FILE: stacker/context.py
  function get_fqn (line 15) | def get_fqn(base_fqn, delimiter, name=None):
  class Context (line 28) | class Context(object):
    method __init__ (line 46) | def __init__(self, environment=None,
    method namespace (line 57) | def namespace(self):
    method namespace_delimiter (line 61) | def namespace_delimiter(self):
    method template_indent (line 68) | def template_indent(self):
    method bucket_name (line 75) | def bucket_name(self):
    method upload_templates_to_s3 (line 83) | def upload_templates_to_s3(self):
    method tags (line 104) | def tags(self):
    method _base_fqn (line 113) | def _base_fqn(self):
    method mappings (line 117) | def mappings(self):
    method _get_stack_definitions (line 120) | def _get_stack_definitions(self):
    method get_targets (line 123) | def get_targets(self):
    method get_stacks (line 138) | def get_stacks(self):
    method get_stack (line 166) | def get_stack(self, name):
    method get_stacks_dict (line 171) | def get_stacks_dict(self):
    method get_fqn (line 174) | def get_fqn(self, name=None):
    method set_hook_data (line 183) | def set_hook_data(self, key, data):
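
`get_fqn` above builds a stack's fully qualified name from the namespace, a delimiter, and the stack name. The joining behavior can be sketched as follows (inferred from the signature and typical usage; treat the details as an assumption, not the module's exact code):

```python
def get_fqn(base_fqn, delimiter, name=None):
    """Join a namespace and a stack name into a fully qualified name.

    Names that already carry the namespace prefix are returned
    unchanged, so the function is idempotent.
    """
    prefix = base_fqn + delimiter
    if name and name.startswith(prefix):
        return name
    # Drop empty parts so a bare namespace (name=None) round-trips cleanly.
    return delimiter.join(part for part in (base_fqn, name) if part)


fqn = get_fqn("prod", "-", "vpc")
```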

FILE: stacker/dag/__init__.py
  class DAGValidationError (line 10) | class DAGValidationError(Exception):
  class DAG (line 14) | class DAG(object):
    method __init__ (line 17) | def __init__(self):
    method add_node (line 21) | def add_node(self, node_name):
    method add_node_if_not_exists (line 36) | def add_node_if_not_exists(self, node_name):
    method delete_node (line 47) | def delete_node(self, node_name):
    method delete_node_if_exists (line 65) | def delete_node_if_exists(self, node_name):
    method add_edge (line 80) | def add_edge(self, ind_node, dep_node):
    method delete_edge (line 107) | def delete_edge(self, ind_node, dep_node):
    method transpose (line 125) | def transpose(self):
    method walk (line 141) | def walk(self, walk_func):
    method transitive_reduction (line 158) | def transitive_reduction(self):
    method rename_edges (line 187) | def rename_edges(self, old_node_name, new_node_name):
    method predecessors (line 205) | def predecessors(self, node):
    method downstream (line 217) | def downstream(self, node):
    method all_downstreams (line 232) | def all_downstreams(self, node):
    method filter (line 257) | def filter(self, nodes):
    method all_leaves (line 283) | def all_leaves(self):
    method from_dict (line 292) | def from_dict(self, graph_dict):
    method reset_graph (line 313) | def reset_graph(self):
    method ind_nodes (line 317) | def ind_nodes(self):
    method validate (line 330) | def validate(self):
    method topological_sort (line 340) | def topological_sort(self):
    method size (line 378) | def size(self):
    method __len__ (line 381) | def __len__(self):
  function walk (line 385) | def walk(dag, walk_func):
  class UnlimitedSemaphore (line 389) | class UnlimitedSemaphore(object):
    method acquire (line 394) | def acquire(self, *args):
    method release (line 397) | def release(self):
  class ThreadedWalker (line 401) | class ThreadedWalker(object):
    method __init__ (line 410) | def __init__(self, semaphore):
    method walk (line 413) | def walk(self, dag, walk_func):
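
`stacker/dag/__init__.py` above is the dependency graph that orders stack operations (`topological_sort`, `ind_nodes`, `validate`, etc.). The core idea can be sketched independently of stacker with Kahn's algorithm over a `{node: [downstream, ...]}` mapping, the same shape `from_dict` accepts; this is an illustration of the technique, not the module's code:

```python
from collections import deque


def topological_sort(graph):
    """Return nodes of a {node: [downstream, ...]} DAG in dependency order.

    Raises ValueError on a cycle (cf. DAGValidationError).
    """
    # Count incoming edges for each node.
    in_degree = {node: 0 for node in graph}
    for downstreams in graph.values():
        for node in downstreams:
            in_degree[node] += 1

    # Seed with the independent nodes (cf. DAG.ind_nodes).
    queue = deque(n for n, d in in_degree.items() if d == 0)
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for downstream in graph[node]:
            in_degree[downstream] -= 1
            if in_degree[downstream] == 0:
                queue.append(downstream)

    if len(order) != len(graph):
        raise ValueError("graph is not acyclic")
    return order


# 'vpc' must come before the stacks that depend on it.
order = topological_sort({"vpc": ["db", "app"], "db": ["app"], "app": []})
```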

FILE: stacker/environment.py
  class DictWithSourceType (line 5) | class DictWithSourceType(dict):
    method __init__ (line 14) | def __init__(self, source_type, *args):
  function parse_environment (line 21) | def parse_environment(raw_environment):
  function parse_yaml_environment (line 40) | def parse_yaml_environment(raw_environment):

FILE: stacker/exceptions.py
  class InvalidConfig (line 3) | class InvalidConfig(Exception):
    method __init__ (line 4) | def __init__(self, errors):
  class InvalidLookupCombination (line 9) | class InvalidLookupCombination(Exception):
    method __init__ (line 11) | def __init__(self, lookup, lookups, value, *args, **kwargs):
  class InvalidLookupConcatenation (line 21) | class InvalidLookupConcatenation(Exception):
    method __init__ (line 26) | def __init__(self, lookup, lookups, *args, **kwargs):
  class UnknownLookupType (line 32) | class UnknownLookupType(Exception):
    method __init__ (line 34) | def __init__(self, lookup_type, *args, **kwargs):
  class FailedVariableLookup (line 39) | class FailedVariableLookup(Exception):
    method __init__ (line 41) | def __init__(self, variable_name, lookup, error, *args, **kwargs):
  class FailedLookup (line 50) | class FailedLookup(Exception):
    method __init__ (line 55) | def __init__(self, lookup, error, *args, **kwargs):
  class InvalidUserdataPlaceholder (line 61) | class InvalidUserdataPlaceholder(Exception):
    method __init__ (line 63) | def __init__(self, blueprint_name, exception_message, *args, **kwargs):
  class UnresolvedVariables (line 72) | class UnresolvedVariables(Exception):
    method __init__ (line 74) | def __init__(self, blueprint_name, *args, **kwargs):
  class UnresolvedVariable (line 80) | class UnresolvedVariable(Exception):
    method __init__ (line 82) | def __init__(self, blueprint_name, variable, *args, **kwargs):
  class UnresolvedVariableValue (line 91) | class UnresolvedVariableValue(Exception):
    method __init__ (line 96) | def __init__(self, lookup, *args, **kwargs):
  class MissingVariable (line 102) | class MissingVariable(Exception):
    method __init__ (line 104) | def __init__(self, blueprint_name, variable_name, *args, **kwargs):
  class VariableTypeRequired (line 110) | class VariableTypeRequired(Exception):
    method __init__ (line 112) | def __init__(self, blueprint_name, variable_name, *args, **kwargs):
  class StackDoesNotExist (line 120) | class StackDoesNotExist(Exception):
    method __init__ (line 122) | def __init__(self, stack_name, *args, **kwargs):
  class MissingParameterException (line 128) | class MissingParameterException(Exception):
    method __init__ (line 130) | def __init__(self, parameters, *args, **kwargs):
  class OutputDoesNotExist (line 139) | class OutputDoesNotExist(Exception):
    method __init__ (line 141) | def __init__(self, stack_name, output, *args, **kwargs):
  class MissingEnvironment (line 150) | class MissingEnvironment(Exception):
    method __init__ (line 152) | def __init__(self, key, *args, **kwargs):
  class WrongEnvironmentType (line 158) | class WrongEnvironmentType(Exception):
    method __init__ (line 160) | def __init__(self, key, *args, **kwargs):
  class ImproperlyConfigured (line 166) | class ImproperlyConfigured(Exception):
    method __init__ (line 168) | def __init__(self, cls, error, *args, **kwargs):
  class StackDidNotChange (line 176) | class StackDidNotChange(Exception):
  class CancelExecution (line 183) | class CancelExecution(Exception):
  class ValidatorError (line 188) | class ValidatorError(Exception):
    method __init__ (line 193) | def __init__(self, variable, validator, value, exception=None):
    method __str__ (line 205) | def __str__(self):
  class ChangesetDidNotStabilize (line 209) | class ChangesetDidNotStabilize(Exception):
    method __init__ (line 210) | def __init__(self, change_set_id):
  class UnhandledChangeSetStatus (line 219) | class UnhandledChangeSetStatus(Exception):
    method __init__ (line 220) | def __init__(self, stack_name, change_set_id, status, status_reason):
  class UnableToExecuteChangeSet (line 234) | class UnableToExecuteChangeSet(Exception):
    method __init__ (line 235) | def __init__(self, stack_name, change_set_id, execution_status):
  class StackUpdateBadStatus (line 246) | class StackUpdateBadStatus(Exception):
    method __init__ (line 248) | def __init__(self, stack_name, stack_status, reason, *args, **kwargs):
  class PlanFailed (line 257) | class PlanFailed(Exception):
    method __init__ (line 259) | def __init__(self, failed_steps, *args, **kwargs):
  class GraphError (line 268) | class GraphError(Exception):
    method __init__ (line 272) | def __init__(self, exception, stack, dependency):

FILE: stacker/hooks/aws_lambda.py
  function _zip_files (line 28) | def _zip_files(files, root):
  function _calculate_hash (line 73) | def _calculate_hash(files, root):
  function _calculate_prebuilt_hash (line 98) | def _calculate_prebuilt_hash(f):
  function _find_files (line 110) | def _find_files(root, includes, excludes, follow_symlinks):
  function _zip_from_file_patterns (line 144) | def _zip_from_file_patterns(root, includes, excludes, follow_symlinks):
  function _head_object (line 179) | def _head_object(s3_conn, bucket, key):
  function _upload_code (line 204) | def _upload_code(s3_conn, bucket, prefix, name, contents, content_hash,
  function _check_pattern_list (line 247) | def _check_pattern_list(patterns, key, default=None):
  function _upload_prebuilt_zip (line 282) | def _upload_prebuilt_zip(s3_conn, bucket, prefix, name, options, path,
  function _build_and_upload_zip (line 297) | def _build_and_upload_zip(s3_conn, bucket, prefix, name, options, path,
  function _upload_function (line 314) | def _upload_function(s3_conn, bucket, prefix, name, options, follow_syml...
  function select_bucket_region (line 374) | def select_bucket_region(custom_bucket, hook_region, stacker_bucket_region,
  function upload_lambda_functions (line 400) | def upload_lambda_functions(context, provider, **kwargs):

FILE: stacker/hooks/command.py
  function _devnull (line 11) | def _devnull():
  function run_command (line 15) | def run_command(provider, context, command, capture=False, interactive=F...

FILE: stacker/hooks/ecs.py
  function create_clusters (line 12) | def create_clusters(provider, context, **kwargs):

FILE: stacker/hooks/iam.py
  function create_ecs_service_role (line 16) | def create_ecs_service_role(provider, context, **kwargs):
  function _get_cert_arn_from_response (line 63) | def _get_cert_arn_from_response(response):
  function get_cert_contents (line 71) | def get_cert_contents(kwargs):
  function ensure_server_cert_exists (line 124) | def ensure_server_cert_exists(provider, context, **kwargs):

FILE: stacker/hooks/keypair.py
  function get_existing_key_pair (line 18) | def get_existing_key_pair(ec2, keypair_name):
  function import_key_pair (line 38) | def import_key_pair(ec2, keypair_name, public_key_data):
  function read_public_key_file (line 50) | def read_public_key_file(path):
  function create_key_pair_from_public_key_file (line 67) | def create_key_pair_from_public_key_file(ec2, keypair_name, public_key_p...
  function create_key_pair_in_ssm (line 80) | def create_key_pair_in_ssm(ec2, ssm, keypair_name, parameter_name,
  function create_key_pair (line 117) | def create_key_pair(ec2, keypair_name):
  function create_key_pair_local (line 126) | def create_key_pair_local(ec2, keypair_name, dest_dir):
  function interactive_prompt (line 153) | def interactive_prompt(keypair_name, ):
  function ensure_keypair_exists (line 181) | def ensure_keypair_exists(provider, context, **kwargs):

FILE: stacker/hooks/route53.py
  function create_domain (line 10) | def create_domain(provider, context, **kwargs):

FILE: stacker/hooks/utils.py
  function full_path (line 11) | def full_path(path):
  function handle_hooks (line 15) | def handle_hooks(stage, hooks, provider, context):

FILE: stacker/logger/__init__.py
  class ColorFormatter (line 12) | class ColorFormatter(logging.Formatter):
    method format (line 14) | def format(self, record):
  function setup_logging (line 21) | def setup_logging(verbosity, formats=None):

FILE: stacker/lookups/__init__.py
  function extract_lookups_from_string (line 26) | def extract_lookups_from_string(value):
  function extract_lookups (line 46) | def extract_lookups(value):
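
`extract_lookups_from_string` above is what finds `${lookup_type input}` expressions inside config values before the handlers in `stacker/lookups/handlers/` resolve them. A regex sketch of that extraction (the `${type args}` syntax matches stacker's documented lookup form; the regex itself is illustrative and does not handle nested lookups):

```python
import re

# Matches non-nested ${lookup_type input} expressions, e.g.
# ${output base-ami::ImageId} or ${envvar AWS_DEFAULT_REGION}.
LOOKUP_RE = re.compile(r"\$\{([._a-zA-Z0-9]+) ([^\}]+)\}")


def extract_lookups_from_string(value):
    """Return (lookup_type, lookup_input) pairs found in a string."""
    return LOOKUP_RE.findall(value)


pairs = extract_lookups_from_string(
    "ami: ${output base-ami::ImageId}, zone: ${envvar AWS_DEFAULT_REGION}")
```

Keeping the lookup type and its input as separate capture groups is what lets a registry (cf. `stacker/lookups/registry.py`) dispatch each match to the handler registered for that type.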

FILE: stacker/lookups/handlers/__init__.py
  class LookupHandler (line 3) | class LookupHandler(object):
    method handle (line 5) | def handle(cls, value, context, provider):
    method dependencies (line 19) | def dependencies(cls, lookup_data):

FILE: stacker/lookups/handlers/ami.py
  class ImageNotFound (line 11) | class ImageNotFound(Exception):
    method __init__ (line 12) | def __init__(self, search_string):
  class AmiLookup (line 20) | class AmiLookup(LookupHandler):
    method handle (line 22) | def handle(cls, value, provider, **kwargs):

FILE: stacker/lookups/handlers/default.py
  class DefaultLookup (line 8) | class DefaultLookup(LookupHandler):
    method handle (line 10) | def handle(cls, value, **kwargs):

FILE: stacker/lookups/handlers/dynamodb.py
  class DynamodbLookup (line 11) | class DynamodbLookup(LookupHandler):
    method handle (line 13) | def handle(cls, value, **kwargs):
  function _lookup_key_parse (line 81) | def _lookup_key_parse(table_keys):
  function _build_projection_expression (line 125) | def _build_projection_expression(clean_table_keys):
  function _get_val_from_ddb_data (line 142) | def _get_val_from_ddb_data(data, keylist):
  function _convert_ddb_list_to_list (line 176) | def _convert_ddb_list_to_list(conversion_list):

FILE: stacker/lookups/handlers/envvar.py
  class EnvvarLookup (line 9) | class EnvvarLookup(LookupHandler):
    method handle (line 11) | def handle(cls, value, **kwargs):

FILE: stacker/lookups/handlers/file.py
  class FileLookup (line 20) | class FileLookup(LookupHandler):
    method handle (line 22) | def handle(cls, value, **kwargs):
  function _parameterize_string (line 112) | def _parameterize_string(raw):
  function parameterized_codec (line 142) | def parameterized_codec(raw, b64):
  function _parameterize_obj (line 167) | def _parameterize_obj(obj):
  class SafeUnicodeLoader (line 197) | class SafeUnicodeLoader(yaml.SafeLoader):
    method construct_yaml_str (line 198) | def construct_yaml_str(self, node):
  function yaml_codec (line 202) | def yaml_codec(raw, parameterized=False):
  function json_codec (line 207) | def json_codec(raw, parameterized=False):

FILE: stacker/lookups/handlers/hook_data.py
  class HookDataLookup (line 8) | class HookDataLookup(LookupHandler):
    method handle (line 10) | def handle(cls, value, context, **kwargs):

FILE: stacker/lookups/handlers/kms.py
  class KmsLookup (line 11) | class KmsLookup(LookupHandler):
    method handle (line 13) | def handle(cls, value, **kwargs):

FILE: stacker/lookups/handlers/output.py
  class OutputLookup (line 12) | class OutputLookup(LookupHandler):
    method handle (line 14) | def handle(cls, value, context=None, **kwargs):
    method dependencies (line 35) | def dependencies(cls, lookup_data):
  function deconstruct (line 57) | def deconstruct(value):

FILE: stacker/lookups/handlers/rxref.py
  class RxrefLookup (line 20) | class RxrefLookup(LookupHandler):
    method handle (line 22) | def handle(cls, value, provider=None, context=None, **kwargs):

FILE: stacker/lookups/handlers/split.py
  class SplitLookup (line 5) | class SplitLookup(LookupHandler):
    method handle (line 7) | def handle(cls, value, **kwargs):

FILE: stacker/lookups/handlers/ssmstore.py
  class SsmstoreLookup (line 10) | class SsmstoreLookup(LookupHandler):
    method handle (line 12) | def handle(cls, value, **kwargs):

FILE: stacker/lookups/handlers/xref.py
  class XrefLookup (line 19) | class XrefLookup(LookupHandler):
    method handle (line 21) | def handle(cls, value, provider=None, **kwargs):

FILE: stacker/lookups/registry.py
  function register_lookup_handler (line 26) | def register_lookup_handler(lookup_type, handler_or_path):
  function unregister_lookup_handler (line 53) | def unregister_lookup_handler(lookup_type):
  function resolve_lookups (line 66) | def resolve_lookups(variable, context, provider):

FILE: stacker/plan.py
  function log_step (line 32) | def log_step(step):
  class Step (line 40) | class Step(object):
    method __init__ (line 51) | def __init__(self, stack, fn, watch_func=None):
    method __repr__ (line 58) | def __repr__(self):
    method __str__ (line 61) | def __str__(self):
    method run (line 64) | def run(self):
    method _run_once (line 87) | def _run_once(self):
    method name (line 97) | def name(self):
    method requires (line 101) | def requires(self):
    method required_by (line 105) | def required_by(self):
    method completed (line 109) | def completed(self):
    method skipped (line 114) | def skipped(self):
    method failed (line 119) | def failed(self):
    method done (line 124) | def done(self):
    method ok (line 130) | def ok(self):
    method submitted (line 135) | def submitted(self):
    method set_status (line 139) | def set_status(self, status):
    method complete (line 153) | def complete(self):
    method skip (line 157) | def skip(self):
    method submit (line 161) | def submit(self):
  function build_plan (line 166) | def build_plan(description, graph,
  function build_graph (line 198) | def build_graph(steps):
  class Graph (line 219) | class Graph(object):
    method __init__ (line 242) | def __init__(self, steps=None, dag=None):
    method add_step (line 246) | def add_step(self, step):
    method connect (line 250) | def connect(self, step, dep):
    method transitive_reduction (line 258) | def transitive_reduction(self):
    method walk (line 261) | def walk(self, walker, walk_func):
    method downstream (line 268) | def downstream(self, step_name):
    method transposed (line 272) | def transposed(self):
    method filtered (line 278) | def filtered(self, step_names):
    method topological_sort (line 282) | def topological_sort(self):
    method to_dict (line 286) | def to_dict(self):
  class Plan (line 290) | class Plan(object):
    method __init__ (line 297) | def __init__(self, description, graph):
    method outline (line 302) | def outline(self, level=logging.INFO, message=""):
    method dump (line 327) | def dump(self, directory, context, provider=None):
    method execute (line 354) | def execute(self, *args, **kwargs):
    method walk (line 367) | def walk(self, walker):
    method steps (line 389) | def steps(self):
    method step_names (line 395) | def step_names(self):
    method keys (line 398) | def keys(self):

FILE: stacker/providers/aws/default.py
  function get_cloudformation_client (line 57) | def get_cloudformation_client(session):
  function get_output_dict (line 66) | def get_output_dict(stack):
  function s3_fallback (line 88) | def s3_fallback(fqn, template, parameters, tags, method,
  function get_change_set_name (line 115) | def get_change_set_name():
  function requires_replacement (line 127) | def requires_replacement(changeset):
  function output_full_changeset (line 141) | def output_full_changeset(full_changeset=None, params_diff=None,
  function ask_for_approval (line 182) | def ask_for_approval(full_changeset=None, params_diff=None,
  function output_summary (line 212) | def output_summary(fqn, action, changeset, params_diff,
  function format_params_diff (line 256) | def format_params_diff(params_diff):
  function summarize_params_diff (line 263) | def summarize_params_diff(params_diff):
  function wait_till_change_set_complete (line 284) | def wait_till_change_set_complete(cfn_client, change_set_id, try_count=25,
  function create_change_set (line 330) | def create_change_set(
  function check_tags_contain (line 399) | def check_tags_contain(actual, expected):
  function generate_cloudformation_args (line 419) | def generate_cloudformation_args(
  function generate_stack_policy_args (line 493) | def generate_stack_policy_args(stack_policy=None):
  class ProviderBuilder (line 520) | class ProviderBuilder(object):
    method __init__ (line 523) | def __init__(self, region=None, **kwargs):
    method build (line 529) | def build(self, region=None, profile=None):
  class Provider (line 554) | class Provider(BaseProvider):
    method __init__ (line 604) | def __init__(self, session, region=None, interactive=False,
    method get_stack (line 616) | def get_stack(self, stack_name, **kwargs):
    method get_stack_status (line 625) | def get_stack_status(self, stack, **kwargs):
    method is_stack_completed (line 628) | def is_stack_completed(self, stack, **kwargs):
    method is_stack_in_progress (line 631) | def is_stack_in_progress(self, stack, **kwargs):
    method is_stack_destroyed (line 634) | def is_stack_destroyed(self, stack, **kwargs):
    method is_stack_recreatable (line 637) | def is_stack_recreatable(self, stack, **kwargs):
    method is_stack_rolling_back (line 640) | def is_stack_rolling_back(self, stack, **kwargs):
    method is_stack_failed (line 643) | def is_stack_failed(self, stack, **kwargs):
    method is_stack_in_review (line 646) | def is_stack_in_review(self, stack, **kwargs):
    method tail_stack (line 649) | def tail_stack(self, stack, cancel, log_func=None, **kwargs):
    method _tail_print (line 680) | def _tail_print(e):
    method get_events (line 685) | def get_events(self, stack_name, chronological=True):
    method get_rollback_status_reason (line 708) | def get_rollback_status_reason(self, stack_name):
    method tail (line 723) | def tail(self, stack_name, cancel, log_func=_tail_print, sleep_time=5,
    method destroy_stack (line 745) | def destroy_stack(self, stack, **kwargs):
    method create_stack (line 754) | def create_stack(
    method select_update_method (line 818) | def select_update_method(self, force_interactive, force_change_set):
    method prepare_stack_for_update (line 836) | def prepare_stack_for_update(self, stack, tags):
    method update_stack (line 903) | def update_stack(self, fqn, template, old_parameters, parameters, tags,
    method deal_with_changeset_stack_policy (line 940) | def deal_with_changeset_stack_policy(self, fqn, stack_policy):
    method interactive_update_stack (line 957) | def interactive_update_stack(self, fqn, template, old_parameters,
    method noninteractive_changeset_update (line 1017) | def noninteractive_changeset_update(self, fqn, template, old_parameters,
    method default_update_stack (line 1051) | def default_update_stack(self, fqn, template, old_parameters, parameters,
    method get_stack_name (line 1097) | def get_stack_name(self, stack, **kwargs):
    method get_stack_tags (line 1100) | def get_stack_tags(self, stack, **kwargs):
    method get_outputs (line 1103) | def get_outputs(self, stack_name, *args, **kwargs):
    method get_output_dict (line 1109) | def get_output_dict(self, stack):
    method get_stack_info (line 1112) | def get_stack_info(self, stack):
    method get_stack_changes (line 1134) | def get_stack_changes(self, stack, template, parameters,
    method params_as_dict (line 1252) | def params_as_dict(parameters_list):

FILE: stacker/providers/base.py
  function not_implemented (line 3) | def not_implemented(method):
  class BaseProviderBuilder (line 8) | class BaseProviderBuilder(object):
    method build (line 9) | def build(self, region=None):
  class BaseProvider (line 13) | class BaseProvider(object):
    method get_stack (line 14) | def get_stack(self, stack_name, *args, **kwargs):
    method create_stack (line 18) | def create_stack(self, *args, **kwargs):
    method update_stack (line 22) | def update_stack(self, *args, **kwargs):
    method destroy_stack (line 26) | def destroy_stack(self, *args, **kwargs):
    method get_stack_status (line 30) | def get_stack_status(self, stack_name, *args, **kwargs):
    method get_outputs (line 34) | def get_outputs(self, stack_name, *args, **kwargs):
    method get_output (line 38) | def get_output(self, stack_name, output):
  class Template (line 43) | class Template(object):
    method __init__ (line 51) | def __init__(self, url=None, body=None):

FILE: stacker/session_cache.py
  function get_session (line 17) | def get_session(region, profile=None):

FILE: stacker/stack.py
  function _gather_variables (line 12) | def _gather_variables(stack_def):
  class Stack (line 41) | class Stack(object):
    method __init__ (line 60) | def __init__(
    method __repr__ (line 87) | def __repr__(self):
    method required_by (line 91) | def required_by(self):
    method requires (line 95) | def requires(self):
    method stack_policy (line 110) | def stack_policy(self):
    method blueprint (line 120) | def blueprint(self):
    method tags (line 148) | def tags(self):
    method parameter_values (line 161) | def parameter_values(self):
    method all_parameter_definitions (line 174) | def all_parameter_definitions(self):
    method required_parameter_definitions (line 179) | def required_parameter_definitions(self):
    method resolve (line 183) | def resolve(self, context, provider):
    method set_outputs (line 198) | def set_outputs(self, outputs):

FILE: stacker/status.py
  class Status (line 4) | class Status(object):
    method __init__ (line 5) | def __init__(self, name, code, reason=None):
    method _comparison (line 10) | def _comparison(self, operator, other):
    method __eq__ (line 15) | def __eq__(self, other):
    method __ne__ (line 18) | def __ne__(self, other):
    method __lt__ (line 21) | def __lt__(self, other):
    method __gt__ (line 24) | def __gt__(self, other):
    method __le__ (line 27) | def __le__(self, other):
    method __ge__ (line 30) | def __ge__(self, other):
  class PendingStatus (line 34) | class PendingStatus(Status):
    method __init__ (line 35) | def __init__(self, reason=None):
  class SubmittedStatus (line 39) | class SubmittedStatus(Status):
    method __init__ (line 40) | def __init__(self, reason=None):
  class CompleteStatus (line 44) | class CompleteStatus(Status):
    method __init__ (line 45) | def __init__(self, reason=None):
  class SkippedStatus (line 49) | class SkippedStatus(Status):
    method __init__ (line 50) | def __init__(self, reason=None):
  class FailedStatus (line 54) | class FailedStatus(Status):
    method __init__ (line 55) | def __init__(self, reason=None):
  class NotSubmittedStatus (line 59) | class NotSubmittedStatus(SkippedStatus):
  class NotUpdatedStatus (line 63) | class NotUpdatedStatus(SkippedStatus):
  class DidNotChangeStatus (line 67) | class DidNotChangeStatus(SkippedStatus):
  class StackDoesNotExist (line 71) | class StackDoesNotExist(SkippedStatus):
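
The full set of rich-comparison methods on `Status`, all delegating to a single `_comparison` helper, points at statuses that order by a numeric code, so a plan can ask whether a step has progressed past a given state. A minimal sketch of that pattern, assuming the status names and codes (only a subset of the listed subclasses is shown):

```python
# Sketch of code-ordered statuses: every comparison operator funnels
# through _comparison, which compares the numeric `code` attributes.
import operator


class Status(object):
    def __init__(self, name, code, reason=None):
        self.name = name
        self.code = code
        self.reason = reason

    def _comparison(self, op, other):
        # Only meaningful against another Status-like object.
        if not hasattr(other, "code"):
            return NotImplemented
        return op(self.code, other.code)

    def __eq__(self, other):
        return self._comparison(operator.eq, other)

    def __lt__(self, other):
        return self._comparison(operator.lt, other)

    def __ge__(self, other):
        return self._comparison(operator.ge, other)


class PendingStatus(Status):
    def __init__(self, reason=None):
        super(PendingStatus, self).__init__("pending", 0, reason)


class CompleteStatus(Status):
    def __init__(self, reason=None):
        super(CompleteStatus, self).__init__("complete", 2, reason)


# Ordering follows the codes: a pending step sorts before a complete one.
assert PendingStatus() < CompleteStatus()
```

Subclasses like `NotSubmittedStatus(SkippedStatus)` then only need to fix a name and reason; the comparison behavior is inherited.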

FILE: stacker/target.py
  class Target (line 3) | class Target(object):
    method __init__ (line 9) | def __init__(self, definition):

FILE: stacker/tests/actions/test_base.py
  class TestBlueprint (line 24) | class TestBlueprint(Blueprint):
    method version (line 26) | def version(self):
  class TestBaseAction (line 34) | class TestBaseAction(unittest.TestCase):
    method test_ensure_cfn_bucket_exists (line 35) | def test_ensure_cfn_bucket_exists(self):
    method test_ensure_cfn_bucket_doesnt_exist_us_east (line 53) | def test_ensure_cfn_bucket_doesnt_exist_us_east(self):
    method test_ensure_cfn_bucket_doesnt_exist_us_west (line 77) | def test_ensure_cfn_bucket_doesnt_exist_us_west(self):
    method test_ensure_cfn_forbidden (line 104) | def test_ensure_cfn_forbidden(self):
    method test_stack_template_url (line 122) | def test_stack_template_url(self):

FILE: stacker/tests/actions/test_build.py
  function mock_stack_parameters (line 31) | def mock_stack_parameters(parameters):
  class TestProvider (line 40) | class TestProvider(BaseProvider):
    method __init__ (line 41) | def __init__(self, outputs=None, *args, **kwargs):
    method set_outputs (line 44) | def set_outputs(self, outputs):
    method get_stack (line 47) | def get_stack(self, stack_name, **kwargs):
    method get_outputs (line 52) | def get_outputs(self, stack_name, *args, **kwargs):
  class TestBuildAction (line 57) | class TestBuildAction(unittest.TestCase):
    method setUp (line 58) | def setUp(self):
    method _get_context (line 65) | def _get_context(self, **kwargs):
    method test_handle_missing_params (line 82) | def test_handle_missing_params(self):
    method test_missing_params_no_existing_stack (line 99) | def test_missing_params_no_existing_stack(self):
    method test_existing_stack_params_dont_override_given_params (line 108) | def test_existing_stack_params_dont_override_given_params(self):
    method test_generate_plan (line 126) | def test_generate_plan(self):
    method test_dont_execute_plan_when_outline_specified (line 139) | def test_dont_execute_plan_when_outline_specified(self):
    method test_execute_plan_when_outline_not_specified (line 147) | def test_execute_plan_when_outline_not_specified(self):
    method test_should_update (line 155) | def test_should_update(self):
    method test_should_ensure_cfn_bucket (line 171) | def test_should_ensure_cfn_bucket(self):
    method test_should_submit (line 191) | def test_should_submit(self):
  class TestLaunchStack (line 206) | class TestLaunchStack(TestBuildAction):
    method setUp (line 207) | def setUp(self):
    method _advance (line 255) | def _advance(self, new_provider_status, expected_status, expected_reas...
    method test_launch_stack_disabled (line 261) | def test_launch_stack_disabled(self):
    method test_launch_stack_create (line 267) | def test_launch_stack_create(self):
    method test_launch_stack_create_rollback (line 280) | def test_launch_stack_create_rollback(self):
    method test_launch_stack_recreate (line 303) | def test_launch_stack_recreate(self):
    method test_launch_stack_update_skipped (line 329) | def test_launch_stack_update_skipped(self):
    method test_launch_stack_update_rollback (line 338) | def test_launch_stack_update_rollback(self):
    method test_launch_stack_update_success (line 358) | def test_launch_stack_update_success(self):
  class TestFunctions (line 375) | class TestFunctions(unittest.TestCase):
    method setUp (line 378) | def setUp(self):
    method test_resolve_parameters_unused_parameter (line 383) | def test_resolve_parameters_unused_parameter(self):
    method test_resolve_parameters_none_conversion (line 397) | def test_resolve_parameters_none_conversion(self):
    method test_resolve_parameters_booleans (line 410) | def test_resolve_parameters_booleans(self):

FILE: stacker/tests/actions/test_destroy.py
  class MockStack (line 18) | class MockStack(object):
    method __init__ (line 21) | def __init__(self, name, tags=None, **kwargs):
  class TestDestroyAction (line 29) | class TestDestroyAction(unittest.TestCase):
    method setUp (line 31) | def setUp(self):
    method test_generate_plan (line 46) | def test_generate_plan(self):
    method test_only_execute_plan_when_forced (line 62) | def test_only_execute_plan_when_forced(self):
    method test_execute_plan_when_forced (line 68) | def test_execute_plan_when_forced(self):
    method test_destroy_stack_complete_if_state_submitted (line 74) | def test_destroy_stack_complete_if_state_submitted(self):
    method test_destroy_stack_step_statuses (line 89) | def test_destroy_stack_step_statuses(self):

FILE: stacker/tests/actions/test_diff.py
  class TestDictValueFormat (line 11) | class TestDictValueFormat(unittest.TestCase):
    method test_status (line 12) | def test_status(self):
    method test_format (line 22) | def test_format(self):
  class TestDiffDictionary (line 41) | class TestDiffDictionary(unittest.TestCase):
    method test_diff_dictionaries (line 42) | def test_diff_dictionaries(self):
  class TestDiffParameters (line 73) | class TestDiffParameters(unittest.TestCase):
    method test_diff_parameters_no_changes (line 74) | def test_diff_parameters_no_changes(self):

FILE: stacker/tests/blueprints/test_base.py
  function mock_lookup_handler (line 44) | def mock_lookup_handler(value, provider=None, context=None, fqn=False,
  class TestBuildParameter (line 52) | class TestBuildParameter(unittest.TestCase):
    method test_base_parameter (line 54) | def test_base_parameter(self):
  class TestBlueprintRendering (line 60) | class TestBlueprintRendering(unittest.TestCase):
    method test_to_json (line 62) | def test_to_json(self):
  class TestBaseBlueprint (line 99) | class TestBaseBlueprint(unittest.TestCase):
    method test_add_output (line 100) | def test_add_output(self):
  class TestVariables (line 118) | class TestVariables(unittest.TestCase):
    method test_defined_variables (line 120) | def test_defined_variables(self):
    method test_defined_variables_subclass (line 132) | def test_defined_variables_subclass(self):
    method test_get_variables_unresolved_variables (line 153) | def test_get_variables_unresolved_variables(self):
    method test_set_description (line 161) | def test_set_description(self):
    method test_validate_variable_type_cfntype (line 177) | def test_validate_variable_type_cfntype(self):
    method test_validate_variable_type_cfntype_none_value (line 184) | def test_validate_variable_type_cfntype_none_value(self):
    method test_validate_variable_type_matching_type (line 191) | def test_validate_variable_type_matching_type(self):
    method test_strict_validate_variable_type (line 201) | def test_strict_validate_variable_type(self):
    method test_validate_variable_type_invalid_value (line 208) | def test_validate_variable_type_invalid_value(self):
    method test_resolve_variable_no_type_on_variable_definition (line 215) | def test_resolve_variable_no_type_on_variable_definition(self):
    method test_resolve_variable_no_provided_with_default (line 225) | def test_resolve_variable_no_provided_with_default(self):
    method test_resolve_variable_no_provided_without_default (line 237) | def test_resolve_variable_no_provided_without_default(self):
    method test_resolve_variable_provided_not_resolved (line 247) | def test_resolve_variable_provided_not_resolved(self):
    method _resolve_troposphere_var (line 257) | def _resolve_troposphere_var(self, tpe, value, **kwargs):
    method test_resolve_variable_troposphere_type_resource_single (line 266) | def test_resolve_variable_troposphere_type_resource_single(self):
    method test_resolve_variable_troposphere_type_resource_optional (line 274) | def test_resolve_variable_troposphere_type_resource_optional(self):
    method test_resolve_variable_troposphere_type_value_blank_required (line 278) | def test_resolve_variable_troposphere_type_value_blank_required(self):
    method test_resolve_variable_troposphere_type_resource_many (line 282) | def test_resolve_variable_troposphere_type_resource_many(self):
    method test_resolve_variable_troposphere_type_resource_many_empty (line 294) | def test_resolve_variable_troposphere_type_resource_many_empty(self):
    method test_resolve_variable_troposphere_type_resource_fail (line 298) | def test_resolve_variable_troposphere_type_resource_fail(self):
    method test_resolve_variable_troposphere_type_props_single (line 309) | def test_resolve_variable_troposphere_type_props_single(self):
    method test_resolve_variable_troposphere_type_props_optional (line 318) | def test_resolve_variable_troposphere_type_props_optional(self):
    method test_resolve_variable_troposphere_type_props_many (line 323) | def test_resolve_variable_troposphere_type_props_many(self):
    method test_resolve_variable_troposphere_type_props_many_empty (line 335) | def test_resolve_variable_troposphere_type_props_many_empty(self):
    method test_resolve_variable_troposphere_type_props_fail (line 339) | def test_resolve_variable_troposphere_type_props_fail(self):
    method test_resolve_variable_troposphere_type_unvalidated (line 343) | def test_resolve_variable_troposphere_type_unvalidated(self):
    method test_resolve_variable_troposphere_type_optional_many (line 346) | def test_resolve_variable_troposphere_type_optional_many(self):
    method test_resolve_variable_provided_resolved (line 351) | def test_resolve_variable_provided_resolved(self):
    method test_resolve_variable_allowed_values (line 362) | def test_resolve_variable_allowed_values(self):
    method test_resolve_variable_validator_valid_value (line 376) | def test_resolve_variable_validator_valid_value(self):
    method test_resolve_variable_validator_invalid_value (line 392) | def test_resolve_variable_validator_invalid_value(self):
    method test_resolve_variables (line 411) | def test_resolve_variables(self):
    method test_resolve_variables_lookup_returns_non_string (line 432) | def test_resolve_variables_lookup_returns_non_string(self):
    method test_resolve_variables_lookup_returns_troposphere_obj (line 450) | def test_resolve_variables_lookup_returns_troposphere_obj(self):
    method test_resolve_variables_lookup_returns_non_string_invalid_combo (line 469) | def test_resolve_variables_lookup_returns_non_string_invalid_combo(self):
    method test_get_variables (line 487) | def test_get_variables(self):
    method test_resolve_variables_missing_variable (line 501) | def test_resolve_variables_missing_variable(self):
    method test_resolve_variables_incorrect_type (line 513) | def test_resolve_variables_incorrect_type(self):
    method test_get_variables_default_value (line 524) | def test_get_variables_default_value(self):
    method test_resolve_variables_convert_type (line 538) | def test_resolve_variables_convert_type(self):
    method test_resolve_variables_cfn_type (line 550) | def test_resolve_variables_cfn_type(self):
    method test_resolve_variables_cfn_number (line 562) | def test_resolve_variables_cfn_number(self):
    method test_resolve_variables_cfn_type_list (line 575) | def test_resolve_variables_cfn_type_list(self):
    method test_resolve_variables_cfn_type_list_invalid_value (line 591) | def test_resolve_variables_cfn_type_list_invalid_value(self):
    method test_get_parameter_definitions_cfn_type_list (line 603) | def test_get_parameter_definitions_cfn_type_list(self):
    method test_get_parameter_definitions_cfn_type (line 616) | def test_get_parameter_definitions_cfn_type(self):
    method test_get_required_parameter_definitions_cfn_type (line 628) | def test_get_required_parameter_definitions_cfn_type(self):
    method test_get_parameter_values (line 639) | def test_get_parameter_values(self):
    method test_validate_allowed_values (line 655) | def test_validate_allowed_values(self):
    method test_blueprint_with_parameters_fails (line 662) | def test_blueprint_with_parameters_fails(self):
    method test_variable_exists_but_value_is_none (line 679) | def test_variable_exists_but_value_is_none(self):
  class TestCFNParameter (line 691) | class TestCFNParameter(unittest.TestCase):
    method test_cfnparameter_convert_boolean (line 692) | def test_cfnparameter_convert_boolean(self):
    method test_parse_user_data (line 703) | def test_parse_user_data(self):
    method test_parse_user_data_missing_variable (line 715) | def test_parse_user_data_missing_variable(self):
    method test_parse_user_data_invaled_placeholder (line 725) | def test_parse_user_data_invaled_placeholder(self):
    method test_read_user_data (line 734) | def test_read_user_data(self, parse_mock, file_mock):

FILE: stacker/tests/blueprints/test_raw.py
  function test_get_template_path_local_file (line 19) | def test_get_template_path_local_file(tmpdir):
  function test_get_template_path_invalid_file (line 30) | def test_get_template_path_invalid_file(tmpdir):
  function test_get_template_path_file_in_syspath (line 37) | def test_get_template_path_file_in_syspath(tmpdir, monkeypatch):
  function test_get_template_params (line 52) | def test_get_template_params():
  class TestBlueprintRendering (line 81) | class TestBlueprintRendering(unittest.TestCase):
    method test_to_json (line 84) | def test_to_json(self):
    method test_j2_to_json (line 124) | def test_j2_to_json(self):
  class TestVariables (line 172) | class TestVariables(unittest.TestCase):
    method test_get_parameter_definitions_json (line 175) | def test_get_parameter_definitions_json(self):  # noqa pylint: disable...
    method test_get_parameter_definitions_yaml (line 188) | def test_get_parameter_definitions_yaml(self):  # noqa pylint: disable...
    method test_get_required_parameter_definitions_json (line 202) | def test_get_required_parameter_definitions_json(self):  # noqa pylint...
    method test_get_required_parameter_definitions_yaml (line 213) | def test_get_required_parameter_definitions_yaml(self):  # noqa pylint...

FILE: stacker/tests/blueprints/test_testutil.py
  class Repositories (line 11) | class Repositories(Blueprint):
    method create_template (line 20) | def create_template(self):
  class TestRepositories (line 33) | class TestRepositories(BlueprintTestCase):
    method test_create_template_passes (line 34) | def test_create_template_passes(self):
    method test_create_template_fails (line 43) | def test_create_template_fails(self):

FILE: stacker/tests/conftest.py
  function aws_credentials (line 12) | def aws_credentials():
  function stacker_fixture_dir (line 40) | def stacker_fixture_dir():

FILE: stacker/tests/factories.py
  class MockThreadingEvent (line 8) | class MockThreadingEvent(object):
    method wait (line 9) | def wait(self, timeout=None):
  class MockProviderBuilder (line 13) | class MockProviderBuilder(object):
    method __init__ (line 14) | def __init__(self, provider, region=None):
    method build (line 18) | def build(self, region=None, profile=None):
  function mock_provider (line 22) | def mock_provider(**kwargs):
  function mock_context (line 26) | def mock_context(namespace="default", extra_config_args=None, **kwargs):
  function generate_definition (line 41) | def generate_definition(base_name, stack_id, **overrides):
  function mock_lookup (line 52) | def mock_lookup(lookup_input, lookup_type, raw=None):
  class SessionStub (line 58) | class SessionStub(object):
    method __init__ (line 77) | def __init__(self, client_stub):
    method client (line 80) | def client(self, region):

FILE: stacker/tests/fixtures/mock_blueprints.py
  class FunctionalTests (line 26) | class FunctionalTests(Blueprint):
    method create_template (line 43) | def create_template(self):
  class Dummy (line 186) | class Dummy(Blueprint):
    method create_template (line 193) | def create_template(self):
  class Dummy2 (line 199) | class Dummy2(Blueprint):
    method create_template (line 210) | def create_template(self):
  class LongRunningDummy (line 216) | class LongRunningDummy(Blueprint):
    method create_template (line 242) | def create_template(self):
  class Broken (line 272) | class Broken(Blueprint):
    method create_template (line 283) | def create_template(self):
  class VPC (line 295) | class VPC(Blueprint):
    method create_template (line 340) | def create_template(self):
  class DiffTester (line 344) | class DiffTester(Blueprint):
    method create_template (line 356) | def create_template(self):
  class Bastion (line 361) | class Bastion(Blueprint):
    method create_template (line 395) | def create_template(self):
  class PreOneOhBastion (line 399) | class PreOneOhBastion(Blueprint):
    method create_template (line 434) | def create_template(self):

FILE: stacker/tests/fixtures/mock_hooks.py
  function mock_hook (line 3) | def mock_hook(provider, context, **kwargs):

FILE: stacker/tests/fixtures/mock_lookups.py
  function handler (line 4) | def handler(value, **kwargs):

FILE: stacker/tests/hooks/test_aws_lambda.py
  function all_files (line 27) | def all_files(tmpdir):
  function f1_files (line 49) | def f1_files(tmpdir, all_files):
  function f2_files (line 54) | def f2_files(tmpdir, all_files):
  function prebuilt_zip (line 59) | def prebuilt_zip(stacker_fixture_dir):
  function s3 (line 67) | def s3():
  function assert_s3_zip_file_list (line 72) | def assert_s3_zip_file_list(s3, bucket, key, files, root=None):
  function assert_s3_zip_contents (line 91) | def assert_s3_zip_contents(s3, bucket, key, contents):
  function assert_s3_bucket (line 98) | def assert_s3_bucket(s3, bucket, present=True):
  function context (line 113) | def context():
  function provider (line 118) | def provider():
  function run_hook (line 123) | def run_hook(context, provider):
  function test_bucket_default (line 131) | def test_bucket_default(s3, context, run_hook):
  function test_bucket_custom (line 138) | def test_bucket_custom(s3, context, run_hook):
  function test_prefix (line 146) | def test_prefix(tmpdir, s3, all_files, f1_files, run_hook):
  function test_prefix_missing (line 164) | def test_prefix_missing(tmpdir, s3, all_files, f1_files, run_hook):
  function test_path_missing (line 183) | def test_path_missing(run_hook):
  function test_path_non_zip_non_dir (line 194) | def test_path_non_zip_non_dir(tmpdir, all_files, run_hook):
  function test_path_relative (line 207) | def test_path_relative(tmpdir, s3, run_hook):
  function test_path_home_relative (line 228) | def test_path_home_relative(tmpdir, s3, run_hook):
  function test_multiple_functions (line 254) | def test_multiple_functions(tmpdir, s3, all_files, f1_files, f2_files,
  function test_patterns_invalid (line 282) | def test_patterns_invalid(tmpdir, run_hook):
  function test_patterns_include (line 298) | def test_patterns_include(tmpdir, s3, all_files, run_hook):
  function test_patterns_exclude (line 322) | def test_patterns_exclude(tmpdir, s3, all_files, run_hook):
  function test_patterns_include_exclude (line 345) | def test_patterns_include_exclude(tmpdir, s3, all_files, run_hook):
  function test_patterns_exclude_all (line 365) | def test_patterns_exclude_all(tmpdir, all_files, run_hook):
  function test_idempotence (line 381) | def test_idempotence(tmpdir, s3, all_files, run_hook):
  function test_calculate_hash (line 408) | def test_calculate_hash(tmpdir, all_files, f1_files, f2_files):
  function test_calculate_hash_diff_filename_same_contents (line 422) | def test_calculate_hash_diff_filename_same_contents(tmpdir, all_files):
  function test_calculate_hash_different_ordering (line 435) | def test_calculate_hash_different_ordering(tmpdir, all_files):
  function test_select_bucket_region (line 474) | def test_select_bucket_region(case):
  function test_follow_symlink_nonbool (line 479) | def test_follow_symlink_nonbool(run_hook):
  function linked_dir (line 492) | def linked_dir(tmpdir):
  function test_follow_symlink_true (line 498) | def test_follow_symlink_true(tmpdir, s3, all_files, f1_files, run_hook,
  function test_follow_symlink_false (line 519) | def test_follow_symlink_false(tmpdir, s3, all_files, run_hook, linked_dir):

FILE: stacker/tests/hooks/test_command.py
  class MockProcess (line 15) | class MockProcess(object):
    method __init__ (line 16) | def __init__(self, returncode=0, stdout='', stderr=''):
    method communicate (line 22) | def communicate(self, stdin):
    method wait (line 26) | def wait(self):
    method kill (line 29) | def kill(self):
  class TestCommandHook (line 33) | class TestCommandHook(unittest.TestCase):
    method setUp (line 34) | def setUp(self):
    method tearDown (line 49) | def tearDown(self):
    method run_hook (line 53) | def run_hook(self, **kwargs):
    method test_command_ok (line 62) | def test_command_ok(self):
    method test_command_fail (line 74) | def test_command_fail(self):
    method test_command_ignore_status (line 85) | def test_command_ignore_status(self):
    method test_command_quiet (line 97) | def test_command_quiet(self):
    method test_command_interactive (line 110) | def test_command_interactive(self):
    method test_command_input (line 122) | def test_command_input(self):
    method test_command_capture (line 135) | def test_command_capture(self):
    method test_command_env (line 147) | def test_command_env(self):

FILE: stacker/tests/hooks/test_ecs.py
  class TestECSHooks (line 16) | class TestECSHooks(unittest.TestCase):
    method setUp (line 18) | def setUp(self):
    method test_create_single_cluster (line 22) | def test_create_single_cluster(self):
    method test_create_multiple_clusters (line 50) | def test_create_multiple_clusters(self):
    method test_fail_create_cluster (line 79) | def test_fail_create_cluster(self):

FILE: stacker/tests/hooks/test_iam.py
  class TestIAMHooks (line 28) | class TestIAMHooks(unittest.TestCase):
    method setUp (line 30) | def setUp(self):
    method test_get_cert_arn_from_response (line 34) | def test_get_cert_arn_from_response(self):
    method test_create_service_role (line 49) | def test_create_service_role(self):
    method test_create_service_role_already_exists (line 74) | def test_create_service_role_already_exists(self):

FILE: stacker/tests/hooks/test_keypair.py
  function ssh_key (line 22) | def ssh_key(stacker_fixture_dir):
  function provider (line 31) | def provider():
  function context (line 36) | def context():
  function ec2 (line 41) | def ec2(ssh_key):
  function ssm (line 54) | def ssm():
  function mock_input (line 60) | def mock_input(lines=(), isatty=True):
  function assert_key_present (line 67) | def assert_key_present(hook_result, key_name, fingerprint):
  function test_param_validation (line 80) | def test_param_validation(provider, context):
  function test_keypair_exists (line 87) | def test_keypair_exists(provider, context):
  function test_import_file (line 99) | def test_import_file(tmpdir, provider, context, ssh_key):
  function test_import_bad_key_data (line 109) | def test_import_bad_key_data(tmpdir, provider, context):
  function test_create_in_ssm (line 119) | def test_create_in_ssm(provider, context, ssh_key, ssm_key_id):
  function test_interactive_non_terminal_input (line 142) | def test_interactive_non_terminal_input(capsys, provider, context):
  function test_interactive_retry_cancel (line 154) | def test_interactive_retry_cancel(provider, context):
  function test_interactive_import (line 164) | def test_interactive_import(tmpdir, provider, context, ssh_key):
  function test_interactive_create (line 177) | def test_interactive_create(tmpdir, provider, context, ssh_key):
  function test_interactive_create_bad_dir (line 193) | def test_interactive_create_bad_dir(tmpdir, provider, context):
  function test_interactive_create_existing_file (line 204) | def test_interactive_create_existing_file(tmpdir, provider, context):

FILE: stacker/tests/lookups/handlers/test_ami.py
  class TestAMILookup (line 11) | class TestAMILookup(unittest.TestCase):
    method setUp (line 14) | def setUp(self):
    method test_basic_lookup_single_image (line 20) | def test_basic_lookup_single_image(self, mock_client):
    method test_basic_lookup_with_region (line 48) | def test_basic_lookup_with_region(self, mock_client):
    method test_basic_lookup_multiple_images (line 76) | def test_basic_lookup_multiple_images(self, mock_client):
    method test_basic_lookup_multiple_images_name_match (line 113) | def test_basic_lookup_multiple_images_name_match(self, mock_client):
    method test_basic_lookup_no_matching_images (line 150) | def test_basic_lookup_no_matching_images(self, mock_client):
    method test_basic_lookup_no_matching_images_from_name (line 167) | def test_basic_lookup_no_matching_images_from_name(self, mock_client):

FILE: stacker/tests/lookups/handlers/test_default.py
  class TestDefaultLookup (line 8) | class TestDefaultLookup(unittest.TestCase):
    method setUp (line 10) | def setUp(self):
    method test_env_var_present (line 18) | def test_env_var_present(self):
    method test_env_var_missing (line 25) | def test_env_var_missing(self):
    method test_invalid_value (line 32) | def test_invalid_value(self):

FILE: stacker/tests/lookups/handlers/test_dynamodb.py
  class TestDynamoDBHandler (line 9) | class TestDynamoDBHandler(unittest.TestCase):
    method setUp (line 12) | def setUp(self):
    method test_dynamodb_handler (line 23) | def test_dynamodb_handler(self, mock_client):
    method test_dynamodb_number_handler (line 42) | def test_dynamodb_number_handler(self, mock_client):
    method test_dynamodb_list_handler (line 62) | def test_dynamodb_list_handler(self, mock_client):
    method test_dynamodb_empty_table_handler (line 82) | def test_dynamodb_empty_table_handler(self, mock_client):
    method test_dynamodb_missing_table_handler (line 104) | def test_dynamodb_missing_table_handler(self, mock_client):
    method test_dynamodb_invalid_table_handler (line 125) | def test_dynamodb_invalid_table_handler(self, mock_client):
    method test_dynamodb_invalid_partition_key_handler (line 148) | def test_dynamodb_invalid_partition_key_handler(self, mock_client):
    method test_dynamodb_invalid_partition_val_handler (line 172) | def test_dynamodb_invalid_partition_val_handler(self, mock_client):

FILE: stacker/tests/lookups/handlers/test_envvar.py
  class TestEnvVarHandler (line 6) | class TestEnvVarHandler(unittest.TestCase):
    method setUp (line 8) | def setUp(self):
    method test_valid_envvar (line 14) | def test_valid_envvar(self):
    method test_invalid_envvar (line 18) | def test_invalid_envvar(self):

FILE: stacker/tests/lookups/handlers/test_file.py
  function to_template_dict (line 15) | def to_template_dict(obj):
  class TestFileTranslator (line 29) | class TestFileTranslator(unittest.TestCase):
    method assertTemplateEqual (line 31) | def assertTemplateEqual(left, right):
    method test_parameterized_codec_b64 (line 40) | def test_parameterized_codec_b64(self):
    method test_parameterized_codec_plain (line 49) | def test_parameterized_codec_plain(self):
    method test_parameterized_codec_plain_no_interpolation (line 56) | def test_parameterized_codec_plain_no_interpolation(self):
    method test_yaml_codec_raw (line 63) | def test_yaml_codec_raw(self):
    method test_yaml_codec_parameterized (line 75) | def test_yaml_codec_parameterized(self):
    method test_json_codec_raw (line 88) | def test_json_codec_raw(self):
    method test_json_codec_parameterized (line 97) | def test_json_codec_parameterized(self):
    method test_file_loaded (line 112) | def test_file_loaded(self, content_mock):
    method test_handler_plain (line 118) | def test_handler_plain(self, _):
    method test_handler_b64 (line 123) | def test_handler_b64(self, content_mock):
    method test_handler_parameterized (line 133) | def test_handler_parameterized(self, content_mock, codec_mock):
    method test_handler_parameterized_b64 (line 144) | def test_handler_parameterized_b64(self, content_mock, codec_mock):
    method test_handler_yaml (line 155) | def test_handler_yaml(self, content_mock, codec_mock):
    method test_handler_yaml_parameterized (line 167) | def test_handler_yaml_parameterized(self, content_mock, codec_mock):
    method test_handler_json (line 179) | def test_handler_json(self, content_mock, codec_mock):
    method test_handler_json_parameterized (line 191) | def test_handler_json_parameterized(self, content_mock, codec_mock):
    method test_unknown_codec (line 202) | def test_unknown_codec(self, _):

FILE: stacker/tests/lookups/handlers/test_hook_data.py
  class TestHookDataLookup (line 8) | class TestHookDataLookup(unittest.TestCase):
    method setUp (line 10) | def setUp(self):
    method test_valid_hook_data (line 14) | def test_valid_hook_data(self):
    method test_invalid_hook_data (line 18) | def test_invalid_hook_data(self):
    method test_bad_value_hook_data (line 22) | def test_bad_value_hook_data(self):

FILE: stacker/tests/lookups/handlers/test_output.py
  class TestOutputHandler (line 9) | class TestOutputHandler(unittest.TestCase):
    method setUp (line 11) | def setUp(self):
    method test_output_handler (line 14) | def test_output_handler(self):

FILE: stacker/tests/lookups/handlers/test_rxref.py
  class TestRxrefHandler (line 9) | class TestRxrefHandler(unittest.TestCase):
    method setUp (line 11) | def setUp(self):
    method test_rxref_handler (line 17) | def test_rxref_handler(self):

FILE: stacker/tests/lookups/handlers/test_split.py
  class TestSplitLookup (line 6) | class TestSplitLookup(unittest.TestCase):
    method test_single_character_split (line 7) | def test_single_character_split(self):
    method test_multi_character_split (line 12) | def test_multi_character_split(self):
    method test_invalid_value_split (line 17) | def test_invalid_value_split(self):

FILE: stacker/tests/lookups/handlers/test_ssmstore.py
  class TestSSMStoreHandler (line 9) | class TestSSMStoreHandler(unittest.TestCase):
    method setUp (line 12) | def setUp(self):
    method test_ssmstore_handler (line 40) | def test_ssmstore_handler(self, mock_client):
    method test_ssmstore_invalid_value_handler (line 51) | def test_ssmstore_invalid_value_handler(self, mock_client):
    method test_ssmstore_handler_with_region (line 63) | def test_ssmstore_handler_with_region(self, mock_client):

FILE: stacker/tests/lookups/handlers/test_xref.py
  class TestXrefHandler (line 7) | class TestXrefHandler(unittest.TestCase):
    method setUp (line 9) | def setUp(self):
    method test_xref_handler (line 13) | def test_xref_handler(self):

FILE: stacker/tests/lookups/test_registry.py
  class TestRegistry (line 20) | class TestRegistry(unittest.TestCase):
    method setUp (line 21) | def setUp(self):
    method test_autoloaded_lookup_handlers (line 25) | def test_autoloaded_lookup_handlers(self):
    method test_resolve_lookups_string_unknown_lookup (line 39) | def test_resolve_lookups_string_unknown_lookup(self):
    method test_resolve_lookups_list_unknown_lookup (line 43) | def test_resolve_lookups_list_unknown_lookup(self):
    method resolve_lookups_with_output_handler_raise_valueerror (line 51) | def resolve_lookups_with_output_handler_raise_valueerror(self, variable):
    method test_resolve_lookups_string_failed_variable_lookup (line 67) | def test_resolve_lookups_string_failed_variable_lookup(self):
    method test_resolve_lookups_list_failed_variable_lookup (line 71) | def test_resolve_lookups_list_failed_variable_lookup(self):

FILE: stacker/tests/providers/aws/test_default.py
  function random_string (line 39) | def random_string(length=12):
  function generate_describe_stacks_stack (line 53) | def generate_describe_stacks_stack(stack_name,
  function generate_get_template (line 67) | def generate_get_template(file_name='cfn_template.json',
  function generate_stack_object (line 77) | def generate_stack_object(stack_name, outputs=None):
  function generate_resource_change (line 94) | def generate_resource_change(replacement=True):
  function generate_change_set_response (line 110) | def generate_change_set_response(status, execution_status="AVAILABLE",
  function generate_change (line 147) | def generate_change(action="Modify", resource_type="EC2::Instance",
  class TestMethods (line 175) | class TestMethods(unittest.TestCase):
    method setUp (line 176) | def setUp(self):
    method test_requires_replacement (line 180) | def test_requires_replacement(self):
    method test_summarize_params_diff (line 191) | def test_summarize_params_diff(self):
    method test_ask_for_approval (line 224) | def test_ask_for_approval(self):
    method test_ask_for_approval_with_params_diff (line 243) | def test_ask_for_approval_with_params_diff(self):
    method test_output_full_changeset (line 268) | def test_output_full_changeset(self, mock_safe_dump, patched_format):
    method test_wait_till_change_set_complete_success (line 301) | def test_wait_till_change_set_complete_success(self):
    method test_wait_till_change_set_complete_failed (line 316) | def test_wait_till_change_set_complete_failed(self):
    method test_create_change_set_stack_did_not_change (line 328) | def test_create_change_set_stack_did_not_change(self):
    method test_create_change_set_unhandled_failed_status (line 355) | def test_create_change_set_unhandled_failed_status(self):
    method test_create_change_set_bad_execution_status (line 376) | def test_create_change_set_bad_execution_status(self):
    method test_generate_cloudformation_args (line 397) | def test_generate_cloudformation_args(self):
    method test_generate_cloudformation_args_with_notification_arns (line 445) | def test_generate_cloudformation_args_with_notification_arns(self):
  class TestProviderDefaultMode (line 471) | class TestProviderDefaultMode(unittest.TestCase):
    method setUp (line 472) | def setUp(self):
    method test_get_stack_stack_does_not_exist (line 479) | def test_get_stack_stack_does_not_exist(self):
    method test_get_stack_stack_exists (line 492) | def test_get_stack_stack_exists(self):
    method test_select_update_method (line 508) | def test_select_update_method(self):
    method test_prepare_stack_for_update_completed (line 526) | def test_prepare_stack_for_update_completed(self):
    method test_prepare_stack_for_update_in_progress (line 535) | def test_prepare_stack_for_update_in_progress(self):
    method test_prepare_stack_for_update_non_recreatable (line 546) | def test_prepare_stack_for_update_non_recreatable(self):
    method test_prepare_stack_for_update_disallowed (line 557) | def test_prepare_stack_for_update_disallowed(self):
    method test_prepare_stack_for_update_bad_tags (line 570) | def test_prepare_stack_for_update_bad_tags(self):
    method test_prepare_stack_for_update_recreate (line 585) | def test_prepare_stack_for_update_recreate(self):
    method test_noninteractive_changeset_update_no_stack_policy (line 602) | def test_noninteractive_changeset_update_no_stack_policy(self):
    method test_noninteractive_changeset_update_with_stack_policy (line 630) | def test_noninteractive_changeset_update_with_stack_policy(self):
    method test_get_stack_changes_update (line 661) | def test_get_stack_changes_update(self, mock_output_full_cs):
    method test_get_stack_changes_create (line 713) | def test_get_stack_changes_create(self, mock_output_full_cs):
    method test_tail_stack_retry_on_missing_stack (line 764) | def test_tail_stack_retry_on_missing_stack(self):
    method test_tail_stack_retry_on_missing_stack_eventual_success (line 790) | def test_tail_stack_retry_on_missing_stack_eventual_success(self):
  class TestProviderInteractiveMode (line 850) | class TestProviderInteractiveMode(unittest.TestCase):
    method setUp (line 851) | def setUp(self):
    method test_successful_init (line 858) | def test_successful_init(self):
    method test_update_stack_execute_success_no_stack_policy (line 865) | def test_update_stack_execute_success_no_stack_policy(self,
    method test_update_stack_execute_success_with_stack_policy (line 902) | def test_update_stack_execute_success_with_stack_policy(self,
    method test_select_update_method (line 941) | def test_select_update_method(self):
    method test_get_stack_changes_interactive (line 961) | def test_get_stack_changes_interactive(self, mock_output_summary,

FILE: stacker/tests/test_config.py
  class TestConfig (line 28) | class TestConfig(unittest.TestCase):
    method test_render_missing_env (line 29) | def test_render_missing_env(self):
    method test_render_no_variable_config (line 35) | def test_render_no_variable_config(self):
    method test_render_valid_env_substitution (line 39) | def test_render_valid_env_substitution(self):
    method test_render_blank_env_values (line 43) | def test_render_blank_env_values(self):
    method test_render_yaml (line 52) | def test_render_yaml(self):
    method test_render_yaml_errors (line 112) | def test_render_yaml_errors(self):
    method test_config_validate_missing_stack_source (line 130) | def test_config_validate_missing_stack_source(self):
    method test_config_validate_missing_stack_source_when_locked (line 147) | def test_config_validate_missing_stack_source_when_locked(self):
    method test_config_validate_stack_class_and_template_paths (line 156) | def test_config_validate_stack_class_and_template_paths(self):
    method test_config_validate_missing_name (line 175) | def test_config_validate_missing_name(self):
    method test_config_validate_duplicate_stack_names (line 189) | def test_config_validate_duplicate_stack_names(self):
    method test_dump_unicode (line 207) | def test_dump_unicode(self):
    method test_parse_tags (line 223) | def test_parse_tags(self):
    method test_parse_with_arbitrary_anchors (line 236) | def test_parse_with_arbitrary_anchors(self):
    method test_parse_with_deprecated_parameters (line 251) | def test_parse_with_deprecated_parameters(self):
    method test_config_build (line 271) | def test_config_build(self):
    method test_parse (line 279) | def test_parse(self):
    method test_dump_complex (line 499) | def test_dump_complex(self):
    method test_load_register_custom_lookups (line 527) | def test_load_register_custom_lookups(self):
    method test_load_adds_sys_path (line 534) | def test_load_adds_sys_path(self):
    method test_process_empty_remote_sources (line 539) | def test_process_empty_remote_sources(self):
    method test_lookup_with_sys_path (line 548) | def test_lookup_with_sys_path(self):
    method test_render_parse_load_namespace_fallback (line 556) | def test_render_parse_load_namespace_fallback(self):
    method test_allow_most_keys_to_be_duplicates_for_overrides (line 567) | def test_allow_most_keys_to_be_duplicates_for_overrides(self):
    method test_raise_constructor_error_on_keyword_duplicate_key (line 597) | def test_raise_constructor_error_on_keyword_duplicate_key(self):
    method test_raise_construct_error_on_duplicate_stack_name_dict (line 610) | def test_raise_construct_error_on_duplicate_stack_name_dict(self):
    method test_parse_invalid_inner_keys (line 624) | def test_parse_invalid_inner_keys(self):

FILE: stacker/tests/test_context.py
  class TestContext (line 8) | class TestContext(unittest.TestCase):
    method setUp (line 10) | def setUp(self):
    method test_context_optional_keys_set (line 16) | def test_context_optional_keys_set(self):
    method test_context_get_stacks (line 24) | def test_context_get_stacks(self):
    method test_context_get_stacks_dict_use_fqn (line 28) | def test_context_get_stacks_dict_use_fqn(self):
    method test_context_get_fqn (line 35) | def test_context_get_fqn(self):
    method test_context_get_fqn_replace_dot (line 40) | def test_context_get_fqn_replace_dot(self):
    method test_context_get_fqn_empty_namespace (line 45) | def test_context_get_fqn_empty_namespace(self):
    method test_context_namespace (line 51) | def test_context_namespace(self):
    method test_context_get_fqn_stack_name (line 55) | def test_context_get_fqn_stack_name(self):
    method test_context_default_bucket_name (line 60) | def test_context_default_bucket_name(self):
    method test_context_bucket_name_is_overriden_but_is_none (line 64) | def test_context_bucket_name_is_overriden_but_is_none(self):
    method test_context_bucket_name_is_overriden (line 73) | def test_context_bucket_name_is_overriden(self):
    method test_context_default_bucket_no_namespace (line 78) | def test_context_default_bucket_no_namespace(self):
    method test_context_namespace_delimiter_is_overriden_and_not_none (line 89) | def test_context_namespace_delimiter_is_overriden_and_not_none(self):
    method test_context_namespace_delimiter_is_overriden_and_is_empty (line 95) | def test_context_namespace_delimiter_is_overriden_and_is_empty(self):
    method test_context_tags_with_empty_map (line 101) | def test_context_tags_with_empty_map(self):
    method test_context_no_tags_specified (line 106) | def test_context_no_tags_specified(self):
    method test_hook_with_sys_path (line 111) | def test_hook_with_sys_path(self):
  class TestFunctions (line 129) | class TestFunctions(unittest.TestCase):
    method test_get_fqn_redundant_base (line 131) | def test_get_fqn_redundant_base(self):
    method test_get_fqn_only_base (line 138) | def test_get_fqn_only_base(self):
    method test_get_fqn_full (line 144) | def test_get_fqn_full(self):

FILE: stacker/tests/test_dag.py
  function empty_dag (line 15) | def empty_dag():
  function basic_dag (line 20) | def basic_dag():
  function test_add_node (line 29) | def test_add_node(empty_dag):
  function test_transpose (line 36) | def test_transpose(basic_dag):
  function test_add_edge (line 46) | def test_add_edge(empty_dag):
  function test_from_dict (line 55) | def test_from_dict(empty_dag):
  function test_reset_graph (line 68) | def test_reset_graph(empty_dag):
  function test_walk (line 77) | def test_walk(empty_dag):
  function test_ind_nodes (line 96) | def test_ind_nodes(basic_dag):
  function test_topological_sort (line 101) | def test_topological_sort(empty_dag):
  function test_successful_validation (line 109) | def test_successful_validation(basic_dag):
  function test_failed_validation (line 114) | def test_failed_validation(empty_dag):
  function test_downstream (line 122) | def test_downstream(basic_dag):
  function test_all_downstreams (line 127) | def test_all_downstreams(basic_dag):
  function test_all_downstreams_pass_graph (line 135) | def test_all_downstreams_pass_graph(empty_dag):
  function test_predecessors (line 146) | def test_predecessors(basic_dag):
  function test_filter (line 155) | def test_filter(basic_dag):
  function test_all_leaves (line 164) | def test_all_leaves(basic_dag):
  function test_size (line 170) | def test_size(basic_dag):
  function test_transitive_reduction_no_reduction (line 178) | def test_transitive_reduction_no_reduction(empty_dag):
  function test_transitive_reduction (line 191) | def test_transitive_reduction(empty_dag):
  function test_transitive_deep_reduction (line 208) | def test_transitive_deep_reduction(empty_dag):
  function test_threaded_walker (line 225) | def test_threaded_walker(empty_dag):
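
The tests above exercise a DAG with operations like `add_node`, `add_edge`, `ind_nodes`, and `topological_sort`. As a rough illustration of what those operations do (an adjacency-dict sketch, not stacker's actual `DAG` class), the core behavior can be shown with Kahn's algorithm:

```python
# Illustrative sketch of the DAG operations exercised by test_dag.py.
# This is NOT stacker's implementation; it is a minimal adjacency-dict
# version showing what add_node/add_edge/ind_nodes/topological_sort do.

class MiniDAG:
    def __init__(self):
        self.graph = {}  # node -> set of downstream nodes

    def add_node(self, node):
        self.graph.setdefault(node, set())

    def add_edge(self, ind, dep):
        # edge ind -> dep: `dep` depends on `ind` finishing first
        self.add_node(ind)
        self.add_node(dep)
        self.graph[ind].add(dep)

    def ind_nodes(self):
        # nodes with no incoming edges: safe starting points for a walk
        downstream = {d for deps in self.graph.values() for d in deps}
        return [n for n in self.graph if n not in downstream]

    def topological_sort(self):
        # Kahn's algorithm: repeatedly emit zero-indegree nodes
        indegree = {n: 0 for n in self.graph}
        for deps in self.graph.values():
            for d in deps:
                indegree[d] += 1
        queue = [n for n, deg in indegree.items() if deg == 0]
        order = []
        while queue:
            n = queue.pop(0)
            order.append(n)
            for d in self.graph[n]:
                indegree[d] -= 1
                if indegree[d] == 0:
                    queue.append(d)
        if len(order) != len(self.graph):
            raise ValueError("graph is cyclic")
        return order


dag = MiniDAG()
dag.add_edge("vpc", "bastion")
dag.add_edge("vpc", "db")
dag.add_edge("db", "app")
print(dag.ind_nodes())  # ['vpc']
order = dag.topological_sort()
```

The cycle check at the end mirrors what `test_failed_validation` verifies: a valid DAG's topological order must cover every node.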

FILE: stacker/tests/test_environment.py
  class TestEnvironment (line 30) | class TestEnvironment(unittest.TestCase):
    method test_simple_key_value_parsing (line 32) | def test_simple_key_value_parsing(self):
    method test_simple_key_value_parsing_exception (line 42) | def test_simple_key_value_parsing_exception(self):
    method test_blank_value (line 46) | def test_blank_value(self):

FILE: stacker/tests/test_lookups.py
  class TestLookupExtraction (line 6) | class TestLookupExtraction(unittest.TestCase):
    method test_no_lookups (line 8) | def test_no_lookups(self):
    method test_single_lookup_string (line 12) | def test_single_lookup_string(self):
    method test_multiple_lookups_string (line 16) | def test_multiple_lookups_string(self):
    method test_lookups_list (line 24) | def test_lookups_list(self):
    method test_lookups_dict (line 31) | def test_lookups_dict(self):
    method test_lookups_mixed (line 38) | def test_lookups_mixed(self):
    method test_nested_lookups_string (line 49) | def test_nested_lookups_string(self):
    method test_comma_delimited (line 55) | def test_comma_delimited(self):
    method test_kms_lookup (line 59) | def test_kms_lookup(self):
    method test_kms_lookup_with_equals (line 66) | def test_kms_lookup_with_equals(self):
    method test_kms_lookup_with_region (line 73) | def test_kms_lookup_with_region(self):
    method test_kms_file_lookup (line 80) | def test_kms_file_lookup(self):
    method test_valid_extract_lookups_from_string (line 87) | def test_valid_extract_lookups_from_string(self):
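
These tests cover pulling `${handler input}` lookup expressions out of config strings. A flat-regex sketch of that extraction (stacker's real parser also handles nested lookups, which this simplification does not):

```python
import re

# Illustrative sketch of the extraction test_lookups.py covers: finding
# `${handler input}` lookup expressions in a string. Simplified; it
# ignores the nested-lookup cases the real tests also exercise.

LOOKUP_PATTERN = re.compile(r"\$\{([a-zA-Z0-9_-]+) ([^}]+)\}")

def extract_lookups(value):
    """Return (handler, input) pairs found in a string."""
    return LOOKUP_PATTERN.findall(value)

pairs = extract_lookups(
    "url: ${output vpc::Endpoint}/path?key=${ssmstore us-east-1@MyKey}"
)
print(pairs)
# [('output', 'vpc::Endpoint'), ('ssmstore', 'us-east-1@MyKey')]
```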

FILE: stacker/tests/test_parse_user_data.py
  class TestCfTokenize (line 8) | class TestCfTokenize(unittest.TestCase):
    method test_tokenize (line 9) | def test_tokenize(self):

FILE: stacker/tests/test_plan.py
  class TestStep (line 38) | class TestStep(unittest.TestCase):
    method setUp (line 40) | def setUp(self):
    method test_status (line 46) | def test_status(self):
  class TestPlan (line 66) | class TestPlan(unittest.TestCase):
    method setUp (line 68) | def setUp(self):
    method tearDown (line 74) | def tearDown(self):
    method test_plan (line 77) | def test_plan(self):
    method test_execute_plan (line 93) | def test_execute_plan(self):
    method test_execute_plan_locked (line 114) | def test_execute_plan_locked(self):
    method test_execute_plan_filtered (line 138) | def test_execute_plan_filtered(self):
    method test_execute_plan_exception (line 166) | def test_execute_plan_exception(self):
    method test_execute_plan_skipped (line 194) | def test_execute_plan_skipped(self):
    method test_execute_plan_failed (line 219) | def test_execute_plan_failed(self):
    method test_execute_plan_cancelled (line 252) | def test_execute_plan_cancelled(self):
    method test_build_graph_missing_dependency (line 277) | def test_build_graph_missing_dependency(self):
    method test_build_graph_cyclic_dependencies (line 293) | def test_build_graph_cyclic_dependencies(self):
    method test_dump (line 314) | def test_dump(self, *args):

FILE: stacker/tests/test_stack.py
  class TestStack (line 11) | class TestStack(unittest.TestCase):
    method setUp (line 13) | def setUp(self):
    method test_stack_requires (line 23) | def test_stack_requires(self):
    method test_stack_requires_circular_ref (line 49) | def test_stack_requires_circular_ref(self):
    method test_stack_cfn_parameters (line 61) | def test_stack_cfn_parameters(self):
    method test_stack_tags_default (line 78) | def test_stack_tags_default(self):
    method test_stack_tags_override (line 87) | def test_stack_tags_override(self):
    method test_stack_tags_extra (line 97) | def test_stack_tags_extra(self):

FILE: stacker/tests/test_stacker.py
  class TestStacker (line 7) | class TestStacker(unittest.TestCase):
    method test_stacker_build_parse_args (line 9) | def test_stacker_build_parse_args(self):
    method test_stacker_build_parse_args_region_from_env (line 23) | def test_stacker_build_parse_args_region_from_env(self):
    method test_stacker_build_context_passed_to_blueprint (line 33) | def test_stacker_build_context_passed_to_blueprint(self):
    method test_stacker_blueprint_property_access_does_not_reset_blueprint (line 53) | def test_stacker_blueprint_property_access_does_not_reset_blueprint(self):
    method test_stacker_build_context_stack_names_specified (line 67) | def test_stacker_build_context_stack_names_specified(self):
    method test_stacker_build_fail_when_parameters_in_stack_def (line 81) | def test_stacker_build_fail_when_parameters_in_stack_def(self):
    method test_stacker_build_custom_info_log_format (line 92) | def test_stacker_build_custom_info_log_format(self):

FILE: stacker/tests/test_util.py
  function mock_create_cache_directories (line 42) | def mock_create_cache_directories(self, **kwargs):
  class TestUtil (line 47) | class TestUtil(unittest.TestCase):
    method test_cf_safe_name (line 49) | def test_cf_safe_name(self):
    method test_load_object_from_string (line 58) | def test_load_object_from_string(self):
    method test_camel_to_snake (line 67) | def test_camel_to_snake(self):
    method test_merge_map (line 77) | def test_merge_map(self):
    method test_yaml_to_ordered_dict (line 109) | def test_yaml_to_ordered_dict(self):
    method test_get_client_region (line 121) | def test_get_client_region(self):
    method test_get_s3_endpoint (line 127) | def test_get_s3_endpoint(self):
    method test_s3_bucket_location_constraint (line 133) | def test_s3_bucket_location_constraint(self):
    method test_parse_cloudformation_template (line 144) | def test_parse_cloudformation_template(self):
    method test_extractors (line 183) | def test_extractors(self):
    method test_SourceProcessor_helpers (line 192) | def test_SourceProcessor_helpers(self):
  function mock_hook (line 276) | def mock_hook(*args, **kwargs):
  function fail_hook (line 281) | def fail_hook(*args, **kwargs):
  function exception_hook (line 285) | def exception_hook(*args, **kwargs):
  function context_hook (line 289) | def context_hook(*args, **kwargs):
  function result_hook (line 293) | def result_hook(*args, **kwargs):
  class TestHooks (line 297) | class TestHooks(unittest.TestCase):
    method setUp (line 299) | def setUp(self):
    method test_empty_hook_stage (line 303) | def test_empty_hook_stage(self):
    method test_missing_required_hook (line 308) | def test_missing_required_hook(self):
    method test_missing_required_hook_method (line 313) | def test_missing_required_hook_method(self):
    method test_missing_non_required_hook_method (line 318) | def test_missing_non_required_hook_method(self):
    method test_default_required_hook (line 323) | def test_default_required_hook(self):
    method test_valid_hook (line 328) | def test_valid_hook(self):
    method test_valid_enabled_hook (line 338) | def test_valid_enabled_hook(self):
    method test_valid_enabled_false_hook (line 348) | def test_valid_enabled_false_hook(self):
    method test_context_provided_to_hook (line 355) | def test_context_provided_to_hook(self):
    method test_hook_failure (line 361) | def test_hook_failure(self):
    method test_return_data_hook (line 377) | def test_return_data_hook(self):
    method test_return_data_hook_duplicate_key (line 399) | def test_return_data_hook_duplicate_key(self):
  class TestException1 (line 415) | class TestException1(Exception):
  class TestException2 (line 419) | class TestException2(Exception):
  class TestExceptionRetries (line 423) | class TestExceptionRetries(unittest.TestCase):
    method setUp (line 424) | def setUp(self):
    method _works_immediately (line 427) | def _works_immediately(self, a, b, x=None, y=None):
    method _works_second_attempt (line 431) | def _works_second_attempt(self, a, b, x=None, y=None):
    method _second_raises_exception2 (line 437) | def _second_raises_exception2(self, a, b, x=None, y=None):
    method _throws_exception2 (line 443) | def _throws_exception2(self, a, b, x=None, y=None):

FILE: stacker/tests/test_variables.py
  class TestVariables (line 16) | class TestVariables(unittest.TestCase):
    method setUp (line 18) | def setUp(self):
    method test_variable_replace_no_lookups (line 22) | def test_variable_replace_no_lookups(self):
    method test_variable_replace_simple_lookup (line 26) | def test_variable_replace_simple_lookup(self):
    method test_variable_resolve_simple_lookup (line 31) | def test_variable_resolve_simple_lookup(self):
    method test_variable_resolve_default_lookup_empty (line 47) | def test_variable_resolve_default_lookup_empty(self):
    method test_variable_replace_multiple_lookups_string (line 53) | def test_variable_replace_multiple_lookups_string(self):
    method test_variable_resolve_multiple_lookups_string (line 65) | def test_variable_resolve_multiple_lookups_string(self):
    method test_variable_replace_no_lookups_list (line 85) | def test_variable_replace_no_lookups_list(self):
    method test_variable_replace_lookups_list (line 89) | def test_variable_replace_lookups_list(self):
    method test_variable_replace_lookups_dict (line 100) | def test_variable_replace_lookups_dict(self):
    method test_variable_replace_lookups_mixed (line 111) | def test_variable_replace_lookups_mixed(self):
    method test_variable_resolve_nested_lookup (line 140) | def test_variable_resolve_nested_lookup(self):
    method test_troposphere_type_no_from_dict (line 162) | def test_troposphere_type_no_from_dict(self):
    method test_troposphere_type_create (line 169) | def test_troposphere_type_create(self):
    method test_troposphere_type_create_multiple (line 176) | def test_troposphere_type_create_multiple(self):

FILE: stacker/tokenize_userdata.py
  function cf_tokenize (line 19) | def cf_tokenize(s):
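
`cf_tokenize` prepares a user-data script for CloudFormation's `Fn::Join` by splitting it into a list of literal chunks and reference objects. The exact reference syntax it recognizes is not visible in this index; the sketch below assumes embedded JSON-style `{"Ref": "Name"}` fragments purely for illustration:

```python
import json
import re

# Hypothetical sketch of splitting a user-data string into Fn::Join
# tokens. The {"Ref": "Name"} syntax here is an assumption for
# illustration; it is not necessarily what cf_tokenize accepts.

REF_RE = re.compile(r'(\{"Ref":\s*"[^"]+"\})')

def tokenize_for_join(userdata):
    tokens = []
    for part in REF_RE.split(userdata):
        if not part:
            continue
        if REF_RE.match(part):
            tokens.append(json.loads(part))  # keep as a Ref object
        else:
            tokens.append(part)              # literal text
    return tokens

tokens = tokenize_for_join('echo {"Ref": "Bucket"} > /tmp/name')
print(tokens)
# ['echo ', {'Ref': 'Bucket'}, ' > /tmp/name']
```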

FILE: stacker/ui.py
  function get_raw_input (line 9) | def get_raw_input(message):
  class UI (line 14) | class UI(object):
    method __init__ (line 21) | def __init__(self):
    method lock (line 24) | def lock(self, *args, **kwargs):
    method unlock (line 29) | def unlock(self, *args, **kwargs):
    method info (line 32) | def info(self, *args, **kwargs):
    method ask (line 41) | def ask(self, message):
    method getpass (line 52) | def getpass(self, *args):
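
The `lock`/`unlock`/`info`/`ask` methods suggest a console wrapper that serializes prompts so concurrent stack threads do not interleave their I/O. A sketch of that pattern (an illustration, not the project's exact code; the `_input` parameter is added here for testability):

```python
import threading

# Sketch of a lock-guarded console UI, in the spirit of stacker/ui.py:
# a shared reentrant lock serializes output and prompts across threads.

class ConsoleUI:
    def __init__(self):
        self._lock = threading.RLock()

    def lock(self):
        self._lock.acquire()

    def unlock(self):
        self._lock.release()

    def info(self, message):
        with self._lock:
            print(message)

    def ask(self, message, _input=input):
        # hold the lock for the whole prompt/response exchange
        with self._lock:
            return _input(message)

ui = ConsoleUI()
ui.info("deploying vpc...")
```

Using an `RLock` lets a thread that already holds the lock (via `lock()`) still call `info` or `ask` without deadlocking itself.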

FILE: stacker/util.py
  function camel_to_snake (line 29) | def camel_to_snake(name):
  function convert_class_name (line 42) | def convert_class_name(kls):
  function parse_zone_id (line 54) | def parse_zone_id(full_zone_id):
  function get_hosted_zone_by_name (line 59) | def get_hosted_zone_by_name(client, zone_name):
  function get_or_create_hosted_zone (line 79) | def get_or_create_hosted_zone(client, zone_name):
  class SOARecordText (line 104) | class SOARecordText(object):
    method __init__ (line 106) | def __init__(self, record_text):
    method __str__ (line 110) | def __str__(self):
  class SOARecord (line 117) | class SOARecord(object):
    method __init__ (line 119) | def __init__(self, record):
  function get_soa_record (line 125) | def get_soa_record(client, zone_id, zone_name):
  function create_route53_zone (line 146) | def create_route53_zone(client, zone_name):
  function load_object_from_string (line 197) | def load_object_from_string(fqcn):
  function merge_map (line 215) | def merge_map(a, b):
  function yaml_to_ordered_dict (line 232) | def yaml_to_ordered_dict(stream, loader=yaml.SafeLoader):
  function uppercase_first_letter (line 318) | def uppercase_first_letter(s):
  function cf_safe_name (line 323) | def cf_safe_name(name):
  function get_config_directory (line 334) | def get_config_directory():
  function read_value_from_path (line 347) | def read_value_from_path(value):
  function get_client_region (line 364) | def get_client_region(client):
  function get_s3_endpoint (line 378) | def get_s3_endpoint(client):
  function s3_bucket_location_constraint (line 392) | def s3_bucket_location_constraint(region):
  function ensure_s3_bucket (line 410) | def ensure_s3_bucket(s3_client, bucket_name, bucket_region):
  function parse_cloudformation_template (line 445) | def parse_cloudformation_template(template):
  class Extractor (line 456) | class Extractor(object):
    method __init__ (line 459) | def __init__(self, archive=None):
    method set_archive (line 468) | def set_archive(self, dir_name):
    method extension (line 478) | def extension():
  class TarExtractor (line 483) | class TarExtractor(Extractor):
    method extract (line 486) | def extract(self, destination):
    method extension (line 492) | def extension():
  class TarGzipExtractor (line 497) | class TarGzipExtractor(Extractor):
    method extract (line 500) | def extract(self, destination):
    method extension (line 506) | def extension():
  class ZipExtractor (line 511) | class ZipExtractor(Extractor):
    method extract (line 514) | def extract(self, destination):
    method extension (line 520) | def extension():
  class SourceProcessor (line 525) | class SourceProcessor(object):
    method __init__ (line 530) | def __init__(self, sources, stacker_cache_dir=None):
    method create_cache_directories (line 548) | def create_cache_directories(self):
    method get_package_sources (line 555) | def get_package_sources(self):
    method fetch_local_package (line 567) | def fetch_local_package(self, config):
    method fetch_s3_package (line 579) | def fetch_s3_package(self, config):
    method fetch_git_package (line 677) | def fetch_git_package(self, config):
    method update_paths_and_config (line 717) | def update_paths_and_config(self, config, pkg_dir_name,
    method git_ls_remote (line 750) | def git_ls_remote(self, uri, ref):
    method determine_git_ls_remote_ref (line 773) | def determine_git_ls_remote_ref(self, config):
    method determine_git_ref (line 791) | def determine_git_ref(self, config):
    method sanitize_uri_path (line 829) | def sanitize_uri_path(self, uri):
    method sanitize_git_path (line 843) | def sanitize_git_path(self, uri, ref=None):
  function stack_template_key_name (line 865) | def stack_template_key_name(blueprint):
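
Two of the small helpers listed above, `camel_to_snake` and `merge_map`, have behavior that test_util.py pins down. These are plausible reimplementations matching the listed signatures, for illustration only, not the project's exact code:

```python
import re

# Illustrative sketches of two helpers from stacker/util.py. Signatures
# come from the index above; the bodies are reimplementations.

def camel_to_snake(name):
    """CamelCase -> snake_case, e.g. 'TestTemplate' -> 'test_template'."""
    s1 = re.sub(r"(.)([A-Z][a-z]+)", r"\1_\2", name)
    return re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s1).lower()

def merge_map(a, b):
    """Recursively merge mapping b into a copy of a; b's leaves win."""
    result = dict(a)
    for key, value in b.items():
        if (key in result and isinstance(result[key], dict)
                and isinstance(value, dict)):
            result[key] = merge_map(result[key], value)
        else:
            result[key] = value
    return result

print(camel_to_snake("TestTemplate"))
# test_template
print(merge_map({"a": {"x": 1}}, {"a": {"y": 2}}))
# {'a': {'x': 1, 'y': 2}}
```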

FILE: stacker/variables.py
  class LookupTemplate (line 13) | class LookupTemplate(Template):
  function resolve_variables (line 19) | def resolve_variables(variables, context, provider):
  class Variable (line 34) | class Variable(object):
    method __init__ (line 43) | def __init__(self, name, value):
    method value (line 49) | def value(self):
    method resolved (line 60) | def resolved(self):
    method resolve (line 67) | def resolve(self, context, provider):
    method dependencies (line 82) | def dependencies(self):
  class VariableValue (line 90) | class VariableValue(object):
    method value (line 94) | def value(self):
    method __iter__ (line 97) | def __iter__(self):
    method resolved (line 100) | def resolved(self):
    method resolve (line 107) | def resolve(self, context, provider):
    method dependencies (line 110) | def dependencies(self):
    method simplified (line 113) | def simplified(self):
    method parse (line 125) | def parse(cls, input_object):
  class VariableValueLiteral (line 174) | class VariableValueLiteral(VariableValue):
    method __init__ (line 175) | def __init__(self, value):
    method value (line 178) | def value(self):
    method __iter__ (line 181) | def __iter__(self):
    method resolved (line 184) | def resolved(self):
    method __repr__ (line 187) | def __repr__(self):
  class VariableValueList (line 191) | class VariableValueList(VariableValue, list):
    method parse (line 193) | def parse(cls, input_object):
    method value (line 200) | def value(self):
    method resolved (line 206) | def resolved(self):
    method __repr__ (line 212) | def __repr__(self):
    method __iter__ (line 215) | def __iter__(self):
    method resolve (line 218) | def resolve(self, context, provider):
    method dependencies (line 222) | def dependencies(self):
    method simplified (line 228) | def simplified(self):
  class VariableValueDict (line 235) | class VariableValueDict(VariableValue, dict):
    method parse (line 237) | def parse(cls, input_object):
    method value (line 244) | def value(self):
    method resolved (line 250) | def resolved(self):
    method __repr__ (line 256) | def __repr__(self):
    method __iter__ (line 261) | def __iter__(self):
    method resolve (line 264) | def resolve(self, context, provider):
    method dependencies (line 268) | def dependencies(self):
    method simplified (line 274) | def simplified(self):
  class VariableValueConcatenation (line 281) | class VariableValueConcatenation(VariableValue, list):
    method value (line 282) | def value(self):
    method __iter__ (line 294) | def __iter__(self):
    method resolved (line 297) | def resolved(self):
    method __repr__ (line 303) | def __repr__(self):
    method resolve (line 306) | def resolve(self, context, provider):
    method dependencies (line 310) | def dependencies(self):
    method simplified (line 316) | def simplified(self):
  class VariableValueLookup (line 346) | class VariableValueLookup(VariableValue):
    method __init__ (line 347) | def __init__(self, lookup_name, lookup_data, handler=None):
    method resolve (line 370) | def resolve(self, context, provider):
    method _resolve (line 390) | def _resolve(self, value):
    method dependencies (line 394) | def dependencies(self):
    method value (line 400) | def value(self):
    method __iter__ (line 406) | def __iter__(self):
    method resolved (line 409) | def resolved(self):
    method __repr__ (line 412) | def __repr__(self):
    method __str__ (line 425) | def __str__(self):
    method simplified (line 431) | def simplified(self):
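
The `Variable`/`VariableValue` classes above parse values like `"${output vpc::Id}"` into lookup nodes and resolve them against a provider. This sketch collapses that machinery into one function with a hand-rolled handler table; the `handlers` argument is illustrative, not stacker's API:

```python
import re

# Simplified sketch of variable resolution in the spirit of
# stacker/variables.py: substitute each ${handler input} expression
# using a handler function looked up by name.

LOOKUP_SYNTAX = re.compile(r"\$\{([a-zA-Z0-9_-]+) ([^}]+)\}")

def resolve_value(value, handlers):
    """Replace every ${handler input} in `value` via `handlers`."""
    def _sub(match):
        handler, data = match.group(1), match.group(2)
        return str(handlers[handler](data))
    return LOOKUP_SYNTAX.sub(_sub, value)

# A fake "output" handler backed by a plain dict of stack outputs.
handlers = {"output": {"vpc::Id": "vpc-1234"}.get}
print(resolve_value("subnet of ${output vpc::Id}", handlers))
# subnet of vpc-1234
```

The real classes additionally track `resolved` state and `dependencies` so the build graph knows which stacks must finish before a lookup can run; this sketch omits that.
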
Condensed preview — 226 files, each showing path, character count, and a content snippet (full structured content: 871K chars).
[
  {
    "path": ".circleci/config.yml",
    "chars": 2873,
    "preview": "version: 2\n\nworkflows:\n  version: 2\n  test-all:\n    jobs:\n      - lint\n      - unit-test-37:\n          requires:\n       "
  },
  {
    "path": ".dockerignore",
    "chars": 11,
    "preview": "Dockerfile\n"
  },
  {
    "path": ".gitignore",
    "chars": 759,
    "preview": "# Compiled source #\n###################\n*.com\n*.class\n*.dll\n*.exe\n*.o\n*.so\n\n# Packages #\n############\n# it's better to u"
  },
  {
    "path": "AUTHORS.rst",
    "chars": 1318,
    "preview": "Authors\n=======\n\nStacker was designed and developed by the OpsEng team at `Remind, Inc.`_\n\nCurrent Maintainers\n---------"
  },
  {
    "path": "CHANGELOG.md",
    "chars": 18683,
    "preview": "## Upcoming release\n\n## 1.7.2 (2020-11-09)\n- address breaking moto change to awslambda [GH-763]\n- Added Python version v"
  },
  {
    "path": "CODE_OF_CONDUCT.md",
    "chars": 3237,
    "preview": "# Contributor Covenant Code of Conduct\n\n## Our Pledge\n\nIn the interest of fostering an open and welcoming environment, w"
  },
  {
    "path": "CONTRIBUTING.md",
    "chars": 3169,
    "preview": "# Contributing\n\nContributions are welcome, and they are greatly appreciated!\n\nYou can contribute in many ways:\n\n## Types"
  },
  {
    "path": "Dockerfile",
    "chars": 368,
    "preview": "FROM python:2.7.10\nMAINTAINER Mike Barrett\n\nCOPY scripts/docker-stacker /bin/docker-stacker\nRUN mkdir -p /stacks && pip "
  },
  {
    "path": "LICENSE",
    "chars": 1303,
    "preview": "Copyright (c) 2015, Remind101, Inc.\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or wit"
  },
  {
    "path": "Makefile",
    "chars": 550,
    "preview": ".PHONY: build lint test-unit test-functional test\n\nbuild:\n\tdocker build -t remind101/stacker .\n\nlint:\n\tflake8 --ignore E"
  },
  {
    "path": "README.rst",
    "chars": 3644,
    "preview": "=======\nstacker\n=======\n\n.. image:: https://readthedocs.org/projects/stacker/badge/?version=latest\n   :target: http://st"
  },
  {
    "path": "RELEASE.md",
    "chars": 1029,
    "preview": "# Steps to release a new version\n\n## Preparing for the release\n\n- Check out a branch named for the version: `git checkou"
  },
  {
    "path": "codecov.yml",
    "chars": 15,
    "preview": "comment: false\n"
  },
  {
    "path": "conf/README.rst",
    "chars": 157,
    "preview": "Please check out the stacker_blueprints_ repo for example configs and \nblueprints.\n\n.. _stacker_blueprints: https://gith"
  },
  {
    "path": "docs/.gitignore",
    "chars": 7,
    "preview": "_build\n"
  },
  {
    "path": "docs/Makefile",
    "chars": 7094,
    "preview": "# Makefile for Sphinx documentation\n#\n\n# You can set these variables from the command line.\nSPHINXOPTS    =\nSPHINXBUILD "
  },
  {
    "path": "docs/api/modules.rst",
    "chars": 58,
    "preview": "stacker\n=======\n\n.. toctree::\n   :maxdepth: 4\n\n   stacker\n"
  },
  {
    "path": "docs/api/stacker.actions.rst",
    "chars": 989,
    "preview": "stacker\\.actions package\n========================\n\nSubmodules\n----------\n\nstacker\\.actions\\.base module\n----------------"
  },
  {
    "path": "docs/api/stacker.blueprints.rst",
    "chars": 621,
    "preview": "stacker\\.blueprints package\n===========================\n\nSubpackages\n-----------\n\n.. toctree::\n\n    stacker.blueprints.v"
  },
  {
    "path": "docs/api/stacker.blueprints.variables.rst",
    "chars": 438,
    "preview": "stacker\\.blueprints\\.variables package\n======================================\n\nSubmodules\n----------\n\nstacker\\.blueprint"
  },
  {
    "path": "docs/api/stacker.commands.rst",
    "chars": 245,
    "preview": "stacker\\.commands package\n=========================\n\nSubpackages\n-----------\n\n.. toctree::\n\n    stacker.commands.stacker"
  },
  {
    "path": "docs/api/stacker.commands.stacker.rst",
    "chars": 1163,
    "preview": "stacker\\.commands\\.stacker package\n==================================\n\nSubmodules\n----------\n\nstacker\\.commands\\.stacker"
  },
  {
    "path": "docs/api/stacker.config.rst",
    "chars": 241,
    "preview": "stacker\\.config package\n=======================\n\nSubpackages\n-----------\n\n.. toctree::\n\n    stacker.config.translators\n\n"
  },
  {
    "path": "docs/api/stacker.config.translators.rst",
    "chars": 420,
    "preview": "stacker\\.config\\.translators package\n====================================\n\nSubmodules\n----------\n\nstacker\\.config\\.trans"
  },
  {
    "path": "docs/api/stacker.hooks.rst",
    "chars": 1126,
    "preview": "stacker\\.hooks package\n======================\n\nSubmodules\n----------\n\nstacker\\.hooks\\.aws\\_lambda module\n---------------"
  },
  {
    "path": "docs/api/stacker.logger.rst",
    "chars": 524,
    "preview": "stacker\\.logger package\n=======================\n\nSubmodules\n----------\n\nstacker\\.logger\\.formatter module\n--------------"
  },
  {
    "path": "docs/api/stacker.lookups.handlers.rst",
    "chars": 2508,
    "preview": "stacker\\.lookups\\.handlers package\n==================================\n\nSubmodules\n----------\n\nstacker\\.lookups\\.handlers"
  },
  {
    "path": "docs/api/stacker.lookups.rst",
    "chars": 434,
    "preview": "stacker\\.lookups package\n========================\n\nSubpackages\n-----------\n\n.. toctree::\n\n    stacker.lookups.handlers\n\n"
  },
  {
    "path": "docs/api/stacker.providers.aws.rst",
    "chars": 402,
    "preview": "stacker\\.providers\\.aws package\n===============================\n\nSubmodules\n----------\n\nstacker\\.providers\\.aws\\.default"
  },
  {
    "path": "docs/api/stacker.providers.rst",
    "chars": 431,
    "preview": "stacker\\.providers package\n==========================\n\nSubpackages\n-----------\n\n.. toctree::\n\n    stacker.providers.aws\n"
  },
  {
    "path": "docs/api/stacker.rst",
    "chars": 1833,
    "preview": "stacker package\n===============\n\nSubpackages\n-----------\n\n.. toctree::\n\n    stacker.actions\n    stacker.blueprints\n    s"
  },
  {
    "path": "docs/blueprints.rst",
    "chars": 16954,
    "preview": "==========\nBlueprints\n==========\n\nBlueprints are python classes that dynamically build CloudFormation templates. Where\ny"
  },
  {
    "path": "docs/commands.rst",
    "chars": 12599,
    "preview": "========\nCommands\n========\n\nBuild\n-----\n\nBuild is used to create/update the stacks provided in the config file. It\nautom"
  },
  {
    "path": "docs/conf.py",
    "chars": 9573,
    "preview": "# -*- coding: utf-8 -*-\n#\n# stacker documentation build configuration file, created by\n# sphinx-quickstart on Fri Aug 14"
  },
  {
    "path": "docs/config.rst",
    "chars": 25295,
    "preview": "=============\nConfiguration\n=============\n\nstacker makes use of a YAML formatted config file to define the different\nClo"
  },
  {
    "path": "docs/environments.rst",
    "chars": 3743,
    "preview": "============\nEnvironments\n============\n\nWhen running stacker, you can optionally provide an \"environment\" file. The\nenvi"
  },
  {
    "path": "docs/index.rst",
    "chars": 2391,
    "preview": ".. stacker documentation master file, created by\n   sphinx-quickstart on Fri Aug 14 09:59:29 2015.\n   You can adapt this"
  },
  {
    "path": "docs/lookups.rst",
    "chars": 15440,
    "preview": "=======\nLookups\n=======\n\nStacker provides the ability to dynamically replace values in the config via a\nconcept called l"
  },
  {
    "path": "docs/organizations_using_stacker.rst",
    "chars": 1917,
    "preview": "===========================\nOrganizations using stacker\n===========================\n\nBelow is a list of organizations th"
  },
  {
    "path": "docs/templates.rst",
    "chars": 698,
    "preview": "==========\nTemplates\n==========\n\nCloudFormation templates can be provided via python Blueprints_ or JSON/YAML.\nJSON/YAML"
  },
  {
    "path": "docs/terminology.rst",
    "chars": 2171,
    "preview": "===========\nTerminology\n===========\n\nblueprint\n=========\n\n.. _blueprints:\n\nA python class that is responsible for creati"
  },
  {
    "path": "docs/translators.rst",
    "chars": 1684,
    "preview": "===========\nTranslators\n===========\n\n.. note::\n  Translators have been deprecated in favor of `Lookups <lookups.html>`_\n"
  },
  {
    "path": "examples/cross-account/.aws/config",
    "chars": 571,
    "preview": "# The master account is like the root of our AWS account tree. It's the\n# entrypoint for all other profiles to sts.Assum"
  },
  {
    "path": "examples/cross-account/README.md",
    "chars": 4049,
    "preview": "This is a secure example setup to support cross-account provisioning of stacks with stacker. It:\n\n1. Sets up an appropri"
  },
  {
    "path": "examples/cross-account/stacker.yaml",
    "chars": 721,
    "preview": "---\nnamespace: ''\n\n# We'll set this to an empty string until we've provisioned the\n# \"stacker-bucket\" stack below.\nstack"
  },
  {
    "path": "examples/cross-account/templates/stacker-bucket.yaml",
    "chars": 864,
    "preview": "---\nAWSTemplateFormatVersion: \"2010-09-09\"\nDescription: A bucket for stacker to store CloudFormation templates\nParameter"
  },
  {
    "path": "examples/cross-account/templates/stacker-role.yaml",
    "chars": 1879,
    "preview": "---\nAWSTemplateFormatVersion: \"2010-09-09\"\nDescription: A role that stacker can assume\nParameters:\n  MasterAccountId:\n  "
  },
  {
    "path": "requirements.in",
    "chars": 222,
    "preview": "troposphere>=3.0.0\nbotocore>=1.12.111\nboto3>=1.9.111,<2.0\nPyYAML>=3.13b1\nawacs>=0.6.0\ngitpython>=3.0\njinja2>=2.7\nschemat"
  },
  {
    "path": "scripts/compare_env",
    "chars": 1896,
    "preview": "#!/usr/bin/env python\n\"\"\" A script to compare environment files. \"\"\"\n\nimport argparse\nimport os.path\n\nfrom stacker.envir"
  },
  {
    "path": "scripts/docker-stacker",
    "chars": 216,
    "preview": "#!/bin/bash\n\n# This script is meant to be used from within the Docker image for stacker. It\n# simply installs the stacks"
  },
  {
    "path": "scripts/stacker",
    "chars": 259,
    "preview": "#!/usr/bin/env python\n\nfrom stacker.logger import setup_logging\nfrom stacker.commands import Stacker\n\nif __name__ == \"__"
  },
  {
    "path": "scripts/stacker.cmd",
    "chars": 1057,
    "preview": "@echo OFF\nREM=\"\"\"\nsetlocal\nset PythonExe=\"\"\nset PythonExeFlags=\n\nfor %%i in (cmd bat exe) do (\n    for %%j in (python.%%"
  },
  {
    "path": "setup.cfg",
    "chars": 167,
    "preview": "[metadata]\ndescription-file = README.rst\n\n[aliases]\ntest = pytest\n\n[tool:pytest]\ntestpaths = stacker/tests\ncov = stacker"
  },
  {
    "path": "setup.py",
    "chars": 1710,
    "preview": "import os\nfrom setuptools import setup, find_packages\n\nVERSION = \"1.7.2\"\n\nsrc_dir = os.path.dirname(__file__)\n\ndef get_i"
  },
  {
    "path": "stacker/__init__.py",
    "chars": 23,
    "preview": "\n__version__ = \"1.7.2\"\n"
  },
  {
    "path": "stacker/actions/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/actions/base.py",
    "chars": 7962,
    "preview": "import os\nimport sys\nimport logging\nimport threading\n\nfrom ..dag import walk, ThreadedWalker, UnlimitedSemaphore\nfrom .."
  },
  {
    "path": "stacker/actions/build.py",
    "chars": 15767,
    "preview": "import logging\n\nfrom .base import BaseAction, plan, build_walker\nfrom .base import STACK_POLL_TIME\n\nfrom ..providers.bas"
  },
  {
    "path": "stacker/actions/destroy.py",
    "chars": 4023,
    "preview": "import logging\n\nfrom .base import BaseAction, plan, build_walker\nfrom .base import STACK_POLL_TIME\nfrom ..exceptions imp"
  },
  {
    "path": "stacker/actions/diff.py",
    "chars": 5930,
    "preview": "import logging\nfrom operator import attrgetter\n\nfrom .base import plan, build_walker\nfrom . import build\nfrom .. import "
  },
  {
    "path": "stacker/actions/graph.py",
    "chars": 1822,
    "preview": "import logging\nimport sys\nimport json\n\nfrom .base import BaseAction, plan\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef e"
  },
  {
    "path": "stacker/actions/info.py",
    "chars": 1134,
    "preview": "import logging\n\nfrom .base import BaseAction\nfrom .. import exceptions\n\nlogger = logging.getLogger(__name__)\n\n\nclass Act"
  },
  {
    "path": "stacker/awscli_yamlhelper.py",
    "chars": 2482,
    "preview": "# Copyright 2012-2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n#\n# Licensed under the Apache License, Ve"
  },
  {
    "path": "stacker/blueprints/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/blueprints/base.py",
    "chars": 18725,
    "preview": "from past.builtins import basestring\nimport copy\nimport hashlib\nimport logging\nimport string\nfrom stacker.util import re"
  },
  {
    "path": "stacker/blueprints/raw.py",
    "chars": 7600,
    "preview": "\"\"\"Blueprint representing raw template module.\"\"\"\n\nimport hashlib\nimport json\nimport os\nimport sys\n\nfrom jinja2 import T"
  },
  {
    "path": "stacker/blueprints/testutil.py",
    "chars": 5466,
    "preview": "import difflib\nimport json\nimport unittest\nimport os.path\nfrom glob import glob\n\nfrom stacker.config import parse as par"
  },
  {
    "path": "stacker/blueprints/variables/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/blueprints/variables/types.py",
    "chars": 8785,
    "preview": "\n\nclass TroposphereType(object):\n\n    def __init__(self, defined_type, many=False, optional=False,\n                 vali"
  },
  {
    "path": "stacker/commands/__init__.py",
    "chars": 37,
    "preview": "from .stacker import Stacker  # NOQA\n"
  },
  {
    "path": "stacker/commands/stacker/__init__.py",
    "chars": 1737,
    "preview": "import logging\n\nfrom .build import Build\nfrom .destroy import Destroy\nfrom .info import Info\nfrom .diff import Diff\nfrom"
  },
  {
    "path": "stacker/commands/stacker/base.py",
    "chars": 7802,
    "preview": "import argparse\nimport threading\nimport signal\nfrom collections.abc import Mapping\nimport logging\nimport os.path\n\nfrom ."
  },
  {
    "path": "stacker/commands/stacker/build.py",
    "chars": 2840,
    "preview": "\"\"\"Launches or updates CloudFormation stacks based on the given config.\n\nStacker is smart enough to figure out if anythi"
  },
  {
    "path": "stacker/commands/stacker/destroy.py",
    "chars": 2175,
    "preview": "\"\"\"Destroys CloudFormation stacks based on the given config.\n\nStacker will determine the order in which stacks should be"
  },
  {
    "path": "stacker/commands/stacker/diff.py",
    "chars": 1525,
    "preview": "\"\"\" Diffs the config against the currently running CloudFormation stacks\n\nSometimes small changes can have big impacts. "
  },
  {
    "path": "stacker/commands/stacker/graph.py",
    "chars": 1190,
    "preview": "\"\"\"Prints the the relationships between steps as a graph.\n\n\"\"\"\n\nfrom .base import BaseCommand\nfrom ...actions import gra"
  },
  {
    "path": "stacker/commands/stacker/info.py",
    "chars": 1027,
    "preview": "\"\"\"Gets information on the CloudFormation stacks based on the given config.\"\"\"\n\nfrom .base import BaseCommand\nfrom ...ac"
  },
  {
    "path": "stacker/config/__init__.py",
    "chars": 19013,
    "preview": "from past.types import basestring\nimport copy\nimport sys\nimport logging\nimport re\n\nfrom string import Template\nfrom io i"
  },
  {
    "path": "stacker/config/translators/__init__.py",
    "chars": 107,
    "preview": "import yaml\n\nfrom .kms import kms_simple_constructor\n\nyaml.add_constructor('!kms', kms_simple_constructor)\n"
  },
  {
    "path": "stacker/config/translators/kms.py",
    "chars": 240,
    "preview": "# NOTE: The translator is going to be deprecated in favor of the lookup\nfrom ...lookups.handlers.kms import KmsLookup\n\n\n"
  },
  {
    "path": "stacker/context.py",
    "chars": 6595,
    "preview": "import collections.abc\nimport logging\n\nfrom stacker.config import Config\nfrom .stack import Stack\nfrom .target import Ta"
  },
  {
    "path": "stacker/dag/__init__.py",
    "chars": 14943,
    "preview": "import logging\nfrom threading import Thread\nfrom copy import copy, deepcopy\nimport collections.abc\nfrom collections impo"
  },
  {
    "path": "stacker/environment.py",
    "chars": 1434,
    "preview": "\nimport yaml\n\n\nclass DictWithSourceType(dict):\n    \"\"\"An environment dict which keeps track of its source.\n\n    Environm"
  },
  {
    "path": "stacker/exceptions.py",
    "chars": 9326,
    "preview": "\n\nclass InvalidConfig(Exception):\n    def __init__(self, errors):\n        super(InvalidConfig, self).__init__(errors)\n  "
  },
  {
    "path": "stacker/hooks/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/hooks/aws_lambda.py",
    "chars": 21818,
    "preview": "from past.builtins import basestring\nimport os\nimport os.path\nimport stat\nimport logging\nimport hashlib\nfrom io import B"
  },
  {
    "path": "stacker/hooks/command.py",
    "chars": 3907,
    "preview": "\nimport logging\nimport os\nfrom subprocess import PIPE, Popen\n\nfrom stacker.exceptions import ImproperlyConfigured\n\nlogge"
  },
  {
    "path": "stacker/hooks/ecs.py",
    "chars": 1287,
    "preview": "# A lot of this code exists to deal w/ the broken ECS connect_to_region\n# function, and will be removed once this pull r"
  },
  {
    "path": "stacker/hooks/iam.py",
    "chars": 4801,
    "preview": "import copy\nimport logging\n\nfrom stacker.session_cache import get_session\nfrom botocore.exceptions import ClientError\n\nf"
  },
  {
    "path": "stacker/hooks/keypair.py",
    "chars": 8150,
    "preview": "\nimport logging\nimport os\nimport sys\n\nfrom botocore.exceptions import ClientError\n\nfrom stacker.session_cache import get"
  },
  {
    "path": "stacker/hooks/route53.py",
    "chars": 813,
    "preview": "import logging\n\nfrom stacker.session_cache import get_session\n\nfrom stacker.util import create_route53_zone\n\nlogger = lo"
  },
  {
    "path": "stacker/hooks/utils.py",
    "chars": 2793,
    "preview": "import os\nimport sys\nimport collections.abc\nimport logging\n\nfrom stacker.util import load_object_from_string\n\nlogger = l"
  },
  {
    "path": "stacker/logger/__init__.py",
    "chars": 1523,
    "preview": "import sys\nimport logging\n\nDEBUG_FORMAT = (\"[%(asctime)s] %(levelname)s %(threadName)s \"\n                \"%(name)s:%(lin"
  },
  {
    "path": "stacker/lookups/__init__.py",
    "chars": 2218,
    "preview": "from past.builtins import basestring\nfrom collections import namedtuple\nimport re\n\n# export resolve_lookups at this leve"
  },
  {
    "path": "stacker/lookups/handlers/__init__.py",
    "chars": 867,
    "preview": "\n\nclass LookupHandler(object):\n    @classmethod\n    def handle(cls, value, context, provider):\n        \"\"\"\n        Perfo"
  },
  {
    "path": "stacker/lookups/handlers/ami.py",
    "chars": 3164,
    "preview": "from stacker.session_cache import get_session\nimport re\nimport operator\n\nfrom . import LookupHandler\nfrom ...util import"
  },
  {
    "path": "stacker/lookups/handlers/default.py",
    "chars": 1107,
    "preview": "\nfrom . import LookupHandler\n\n\nTYPE_NAME = \"default\"\n\n\nclass DefaultLookup(LookupHandler):\n    @classmethod\n    def hand"
  },
  {
    "path": "stacker/lookups/handlers/dynamodb.py",
    "chars": 6629,
    "preview": "from botocore.exceptions import ClientError\nimport re\nfrom stacker.session_cache import get_session\n\nfrom . import Looku"
  },
  {
    "path": "stacker/lookups/handlers/envvar.py",
    "chars": 976,
    "preview": "import os\n\nfrom . import LookupHandler\nfrom ...util import read_value_from_path\n\nTYPE_NAME = \"envvar\"\n\n\nclass EnvvarLook"
  },
  {
    "path": "stacker/lookups/handlers/file.py",
    "chars": 7029,
    "preview": "\nimport base64\nimport json\nimport re\nfrom collections.abc import Mapping, Sequence\n\nimport yaml\n\nfrom troposphere import"
  },
  {
    "path": "stacker/lookups/handlers/hook_data.py",
    "chars": 577,
    "preview": "\nfrom . import LookupHandler\n\n\nTYPE_NAME = \"hook_data\"\n\n\nclass HookDataLookup(LookupHandler):\n    @classmethod\n    def h"
  },
  {
    "path": "stacker/lookups/handlers/kms.py",
    "chars": 2300,
    "preview": "import codecs\nimport sys\nfrom stacker.session_cache import get_session\n\nfrom . import LookupHandler\nfrom ...util import "
  },
  {
    "path": "stacker/lookups/handlers/output.py",
    "chars": 1944,
    "preview": "\nimport re\nfrom collections import namedtuple\n\nfrom . import LookupHandler\n\nTYPE_NAME = \"output\"\n\nOutput = namedtuple(\"O"
  },
  {
    "path": "stacker/lookups/handlers/rxref.py",
    "chars": 1458,
    "preview": "\"\"\"Handler for fetching outputs from fully qualified stacks.\n\nThe `output` handler supports fetching outputs from stacks"
  },
  {
    "path": "stacker/lookups/handlers/split.py",
    "chars": 1164,
    "preview": "from . import LookupHandler\nTYPE_NAME = \"split\"\n\n\nclass SplitLookup(LookupHandler):\n    @classmethod\n    def handle(cls,"
  },
  {
    "path": "stacker/lookups/handlers/ssmstore.py",
    "chars": 1593,
    "preview": "\nfrom stacker.session_cache import get_session\n\nfrom . import LookupHandler\nfrom ...util import read_value_from_path\n\nTY"
  },
  {
    "path": "stacker/lookups/handlers/xref.py",
    "chars": 1254,
    "preview": "\"\"\"Handler for fetching outputs from fully qualified stacks.\n\nThe `output` handler supports fetching outputs from stacks"
  },
  {
    "path": "stacker/lookups/registry.py",
    "chars": 3588,
    "preview": "\nimport logging\nimport warnings\n\nfrom past.builtins import basestring\n\nfrom ..exceptions import UnknownLookupType, Faile"
  },
  {
    "path": "stacker/plan.py",
    "chars": 11740,
    "preview": "import os\nimport logging\nimport time\nimport uuid\nimport threading\n\nfrom .util import stack_template_key_name\nfrom .excep"
  },
  {
    "path": "stacker/providers/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/providers/aws/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/providers/aws/default.py",
    "chars": 48203,
    "preview": "import json\nimport yaml\nimport logging\nimport time\nimport urllib.parse\nimport sys\n\n# thread safe, memoized, provider bui"
  },
  {
    "path": "stacker/providers/base.py",
    "chars": 1636,
    "preview": "\n\ndef not_implemented(method):\n    raise NotImplementedError(\"Provider does not support '%s' \"\n                         "
  },
  {
    "path": "stacker/session_cache.py",
    "chars": 1186,
    "preview": "import boto3\nimport logging\nfrom .ui import ui\n\n\nlogger = logging.getLogger(__name__)\n\n\n# A global credential cache that"
  },
  {
    "path": "stacker/stack.py",
    "chars": 6821,
    "preview": "import copy\n\nfrom . import util\nfrom .variables import (\n    Variable,\n    resolve_variables,\n)\n\nfrom .blueprints.raw im"
  },
  {
    "path": "stacker/status.py",
    "chars": 2044,
    "preview": "import operator\n\n\nclass Status(object):\n    def __init__(self, name, code, reason=None):\n        self.name = name\n      "
  },
  {
    "path": "stacker/target.py",
    "chars": 476,
    "preview": "\n\nclass Target(object):\n    \"\"\"A \"target\" is just a node in the stacker graph that does nothing, except\n    specify depe"
  },
  {
    "path": "stacker/tests/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/tests/actions/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/tests/actions/test_base.py",
    "chars": 4412,
    "preview": "\nimport unittest\n\nimport mock\n\nimport botocore.exceptions\nfrom botocore.stub import Stubber, ANY\n\nfrom stacker.actions.b"
  },
  {
    "path": "stacker/tests/actions/test_build.py",
    "chars": 15728,
    "preview": "import unittest\nfrom collections import namedtuple\n\nimport mock\n\nfrom stacker import exceptions\nfrom stacker.actions imp"
  },
  {
    "path": "stacker/tests/actions/test_destroy.py",
    "chars": 4481,
    "preview": "import unittest\n\nimport mock\n\nfrom stacker.actions import destroy\nfrom stacker.context import Context, Config\nfrom stack"
  },
  {
    "path": "stacker/tests/actions/test_diff.py",
    "chars": 2817,
    "preview": "import unittest\n\nfrom operator import attrgetter\nfrom stacker.actions.diff import (\n    diff_dictionaries,\n    diff_para"
  },
  {
    "path": "stacker/tests/blueprints/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/tests/blueprints/test_base.py",
    "chars": 27480,
    "preview": "import unittest\nimport sys\nfrom mock import patch\n\nfrom mock import MagicMock\nfrom troposphere import (\n    Base64,\n    "
  },
  {
    "path": "stacker/tests/blueprints/test_raw.py",
    "chars": 7457,
    "preview": "\"\"\"Test module for blueprint-from-raw-template module.\"\"\"\nimport json\nimport unittest\n\nfrom mock import MagicMock\n\nfrom "
  },
  {
    "path": "stacker/tests/blueprints/test_testutil.py",
    "chars": 1580,
    "preview": "import unittest\n\nfrom troposphere import ecr\n\nfrom ...blueprints.testutil import BlueprintTestCase\nfrom ...blueprints.ba"
  },
  {
    "path": "stacker/tests/conftest.py",
    "chars": 1165,
    "preview": "\nimport logging\nimport os\n\nimport pytest\nimport py.path\n\nlogger = logging.getLogger(__name__)\n\n\n@pytest.fixture(scope='s"
  },
  {
    "path": "stacker/tests/factories.py",
    "chars": 2324,
    "preview": "from mock import MagicMock\n\nfrom stacker.context import Context\nfrom stacker.config import Config, Stack\nfrom stacker.lo"
  },
  {
    "path": "stacker/tests/fixtures/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/tests/fixtures/basic.env",
    "chars": 24,
    "preview": "namespace: test.stacker\n"
  },
  {
    "path": "stacker/tests/fixtures/cfn_template.json",
    "chars": 521,
    "preview": "{\n    \"AWSTemplateFormatVersion\": \"2010-09-09\",\n    \"Description\": \"TestTemplate\",\n    \"Parameters\": {\n        \"Param1\":"
  },
  {
    "path": "stacker/tests/fixtures/cfn_template.json.j2",
    "chars": 540,
    "preview": "{\n    \"AWSTemplateFormatVersion\": \"2010-09-09\",\n    \"Description\": \"TestTemplate\",\n    \"Parameters\": {\n        \"Param1\":"
  },
  {
    "path": "stacker/tests/fixtures/cfn_template.yaml",
    "chars": 446,
    "preview": "AWSTemplateFormatVersion: \"2010-09-09\"\nDescription: TestTemplate\nParameters:\n  Param1:\n    Type: String\n  Param2:\n    De"
  },
  {
    "path": "stacker/tests/fixtures/keypair/fingerprint",
    "chars": 48,
    "preview": "d7:50:1f:78:55:5f:22:c1:f6:88:c6:5d:82:4f:94:4f\n"
  },
  {
    "path": "stacker/tests/fixtures/keypair/id_rsa",
    "chars": 1831,
    "preview": "-----BEGIN OPENSSH PRIVATE KEY-----\nb3BlbnNzaC1rZXktdjEAAAAABG5vbmUAAAAEbm9uZQAAAAAAAAABAAABFwAAAAdzc2gtcn\nNhAAAAAwEAAQA"
  },
  {
    "path": "stacker/tests/fixtures/keypair/id_rsa.pub",
    "chars": 381,
    "preview": "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAklOUpkDHrfHY17SbrmTIpNLTGK9Tjom/BWDSUGPl+nafzlHDTYW7hdI4yZ5ew18JH4JW9jbhUFrviQzM7xlE"
  },
  {
    "path": "stacker/tests/fixtures/mock_blueprints.py",
    "chars": 15810,
    "preview": "from troposphere import GetAtt, Output, Sub, Ref\nfrom troposphere import iam\n\nfrom awacs.aws import Policy, Statement, A"
  },
  {
    "path": "stacker/tests/fixtures/mock_hooks.py",
    "chars": 85,
    "preview": "\n\ndef mock_hook(provider, context, **kwargs):\n    return {\"result\": kwargs[\"value\"]}\n"
  },
  {
    "path": "stacker/tests/fixtures/mock_lookups.py",
    "chars": 69,
    "preview": "TYPE_NAME = \"mock\"\n\n\ndef handler(value, **kwargs):\n    return \"mock\"\n"
  },
  {
    "path": "stacker/tests/fixtures/not-basic.env",
    "chars": 42,
    "preview": "namespace: test.stacker\nenvironment: test\n"
  },
  {
    "path": "stacker/tests/fixtures/parameter_resolution/template.yml",
    "chars": 438,
    "preview": "# used in functional test suites, to fix https://github.com/cloudtools/stacker/pull/615\nAWSTemplateFormatVersion: \"2010-"
  },
  {
    "path": "stacker/tests/fixtures/vpc-bastion-db-web-pre-1.0.yaml",
    "chars": 2558,
    "preview": "# Hooks require a path.\n# If the build should stop when a hook fails, set required to true.\n# pre_build happens before t"
  },
  {
    "path": "stacker/tests/fixtures/vpc-bastion-db-web.yaml",
    "chars": 2498,
    "preview": "# Hooks require a path.\n# If the build should stop when a hook fails, set required to true.\n# pre_build happens before t"
  },
  {
    "path": "stacker/tests/fixtures/vpc-custom-log-format-info.yaml",
    "chars": 749,
    "preview": "log_formats:\n  info: \"[%(asctime)s] ${environment} custom log format - %(message)s\"\n\nstacks:\n  - name: vpc\n    class_pat"
  },
  {
    "path": "stacker/tests/hooks/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/tests/hooks/test_aws_lambda.py",
    "chars": 13983,
    "preview": "import os.path\nimport os\nimport mock\nimport random\nfrom io import BytesIO as StringIO\nfrom zipfile import ZipFile\n\nimpor"
  },
  {
    "path": "stacker/tests/hooks/test_command.py",
    "chars": 5331,
    "preview": "\nimport os\nimport unittest\nfrom subprocess import PIPE\n\nimport mock\n\nfrom stacker.context import Context\nfrom stacker.co"
  },
  {
    "path": "stacker/tests/hooks/test_ecs.py",
    "chars": 3182,
    "preview": "import unittest\n\nimport boto3\nfrom moto import mock_ecs\nfrom testfixtures import LogCapture\n\nfrom stacker.hooks.ecs impo"
  },
  {
    "path": "stacker/tests/hooks/test_iam.py",
    "chars": 2790,
    "preview": "import unittest\n\nimport boto3\nfrom botocore.exceptions import ClientError\n\nfrom moto import mock_iam\n\nfrom stacker.hooks"
  },
  {
    "path": "stacker/tests/hooks/test_keypair.py",
    "chars": 6801,
    "preview": "import sys\nfrom collections import namedtuple\nfrom contextlib import contextmanager\n\nimport mock\nimport pytest\n\nimport b"
  },
  {
    "path": "stacker/tests/lookups/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/tests/lookups/handlers/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/tests/lookups/handlers/test_ami.py",
    "chars": 6818,
    "preview": "import unittest\nimport mock\nfrom botocore.stub import Stubber\nfrom stacker.lookups.handlers.ami import AmiLookup, ImageN"
  },
  {
    "path": "stacker/tests/lookups/handlers/test_default.py",
    "chars": 1113,
    "preview": "from mock import MagicMock\nimport unittest\n\nfrom stacker.context import Context\nfrom stacker.lookups.handlers.default im"
  },
  {
    "path": "stacker/tests/lookups/handlers/test_dynamodb.py",
    "chars": 7816,
    "preview": "import unittest\nimport mock\nfrom botocore.stub import Stubber\nfrom stacker.lookups.handlers.dynamodb import DynamodbLook"
  },
  {
    "path": "stacker/tests/lookups/handlers/test_envvar.py",
    "chars": 608,
    "preview": "import unittest\nfrom stacker.lookups.handlers.envvar import EnvvarLookup\nimport os\n\n\nclass TestEnvVarHandler(unittest.Te"
  },
  {
    "path": "stacker/tests/lookups/handlers/test_file.py",
    "chars": 7620,
    "preview": "# encoding: utf-8\n\n\nimport unittest\nimport mock\nimport base64\nimport yaml\nimport json\nfrom troposphere import Base64, Ge"
  },
  {
    "path": "stacker/tests/lookups/handlers/test_hook_data.py",
    "chars": 761,
    "preview": "import unittest\n\n\nfrom stacker.context import Context\nfrom stacker.lookups.handlers.hook_data import HookDataLookup\n\n\ncl"
  },
  {
    "path": "stacker/tests/lookups/handlers/test_output.py",
    "chars": 879,
    "preview": "from mock import MagicMock\nimport unittest\n\nfrom stacker.stack import Stack\nfrom ...factories import generate_definition"
  },
  {
    "path": "stacker/tests/lookups/handlers/test_rxref.py",
    "chars": 863,
    "preview": "from mock import MagicMock\nimport unittest\n\nfrom stacker.lookups.handlers.rxref import RxrefLookup\nfrom ....context impo"
  },
  {
    "path": "stacker/tests/lookups/handlers/test_split.py",
    "chars": 580,
    "preview": "import unittest\n\nfrom stacker.lookups.handlers.split import SplitLookup\n\n\nclass TestSplitLookup(unittest.TestCase):\n    "
  },
  {
    "path": "stacker/tests/lookups/handlers/test_ssmstore.py",
    "chars": 2590,
    "preview": "import unittest\nimport mock\nfrom botocore.stub import Stubber\nfrom stacker.lookups.handlers.ssmstore import SsmstoreLook"
  },
  {
    "path": "stacker/tests/lookups/handlers/test_xref.py",
    "chars": 796,
    "preview": "from mock import MagicMock\nimport unittest\n\nfrom stacker.lookups.handlers.xref import XrefLookup\n\n\nclass TestXrefHandler"
  },
  {
    "path": "stacker/tests/lookups/test_registry.py",
    "chars": 2470,
    "preview": "import unittest\n\nfrom mock import MagicMock\n\nfrom stacker.exceptions import (\n    UnknownLookupType,\n    FailedVariableL"
  },
  {
    "path": "stacker/tests/providers/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/tests/providers/aws/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "stacker/tests/providers/aws/test_default.py",
    "chars": 36619,
    "preview": "import copy\nfrom datetime import datetime\nimport os.path\nimport random\nimport string\nimport threading\nimport unittest\n\nf"
  },
  {
    "path": "stacker/tests/test_config.py",
    "chars": 20883,
    "preview": "import sys\nimport unittest\nimport yaml\n\nfrom stacker.config import (\n    render_parse_load,\n    load,\n    render,\n    pa"
  },
  {
    "path": "stacker/tests/test_context.py",
    "chars": 5795,
    "preview": "import unittest\n\nfrom stacker.context import Context, get_fqn\nfrom stacker.config import load, Config\nfrom stacker.hooks"
  },
  {
    "path": "stacker/tests/test_dag.py",
    "chars": 5992,
    "preview": "\"\"\" Tests on the DAG implementation \"\"\"\nimport threading\n\nimport pytest\n\nfrom stacker.dag import (\n    DAG,\n    DAGValid"
  },
  {
    "path": "stacker/tests/test_environment.py",
    "chars": 1185,
    "preview": "import unittest\n\nfrom stacker.environment import (\n    DictWithSourceType,\n    parse_environment\n)\n\ntest_env = \"\"\"key1: "
  },
  {
    "path": "stacker/tests/test_lookups.py",
    "chars": 4814,
    "preview": "import unittest\n\nfrom stacker.lookups import extract_lookups, extract_lookups_from_string\n\n\nclass TestLookupExtraction(u"
  },
  {
    "path": "stacker/tests/test_parse_user_data.py",
    "chars": 615,
    "preview": "import unittest\n\nimport yaml\n\nfrom ..tokenize_userdata import cf_tokenize\n\n\nclass TestCfTokenize(unittest.TestCase):\n   "
  },
  {
    "path": "stacker/tests/test_plan.py",
    "chars": 10663,
    "preview": "import os\nimport shutil\nimport tempfile\n\nimport unittest\nimport mock\n\nfrom stacker.context import Context, Config\nfrom s"
  },
  {
    "path": "stacker/tests/test_stack.py",
    "chars": 3628,
    "preview": "from mock import MagicMock\nimport unittest\n\nfrom stacker.lookups import register_lookup_handler\nfrom stacker.context imp"
  },
  {
    "path": "stacker/tests/test_stacker.py",
    "chars": 4258,
    "preview": "import unittest\n\nfrom stacker.commands import Stacker\nfrom stacker.exceptions import InvalidConfig\n\n\nclass TestStacker(u"
  },
  {
    "path": "stacker/tests/test_util.py",
    "chars": 15115,
    "preview": "\nimport unittest\n\nimport string\nimport os\nimport queue\n\nimport mock\n\nimport boto3\n\nfrom stacker.config import Hook, GitP"
  },
  {
    "path": "stacker/tests/test_variables.py",
    "chars": 6362,
    "preview": "\nimport unittest\n\nfrom mock import MagicMock\n\nfrom troposphere import s3\nfrom stacker.blueprints.variables.types import "
  },
  {
    "path": "stacker/tokenize_userdata.py",
    "chars": 1397,
    "preview": "import re\n\nfrom troposphere import Ref, GetAtt\n\n\nHELPERS = {\n    \"Ref\": Ref,\n    \"Fn::GetAtt\": GetAtt\n}\n\nsplit_string = "
  },
  {
    "path": "stacker/ui.py",
    "chars": 1702,
    "preview": "import threading\nimport logging\nfrom getpass import getpass\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef get_raw_input(m"
  },
  {
    "path": "stacker/util.py",
    "chars": 30410,
    "preview": "import copy\nimport uuid\nimport importlib\nimport logging\nimport os\nimport re\nimport shutil\nimport subprocess\nimport sys\ni"
  },
  {
    "path": "stacker/variables.py",
    "chars": 12127,
    "preview": "\nimport re\n\nfrom past.builtins import basestring\nfrom string import Template\n\nfrom .exceptions import InvalidLookupCombi"
  },
  {
    "path": "test-requirements.in",
    "chars": 104,
    "preview": "pytest~=6.0\npytest-cov~=2.6\nmock~=2.0\nmoto[awslambda,ec2]~=3.0.0\ntestfixtures~=6.18.3\nflake8\npep8-naming"
  },
  {
    "path": "tests/Makefile",
    "chars": 564,
    "preview": "permissions:\n\t./stacker.yaml.sh | stacker build -\n\ntest: permissions\n\t$(eval AWS_ACCESS_KEY_ID := $(shell ./stacker.yaml"
  },
  {
    "path": "tests/README.md",
    "chars": 834,
    "preview": "This directory contains the functional testing suite for stacker. It exercises all of stacker against a real AWS account"
  },
  {
    "path": "tests/cleanup_functional_test_buckets.sh",
    "chars": 482,
    "preview": "#!/usr/bin/env bash\n\nif [ -z \"$AWS_ACCESS_KEY_ID\" ]\nthen\n    echo \"AWS_ACCESS_KEY_ID not set, skipping bucket cleanup.\"\n"
  },
  {
    "path": "tests/fixtures/blueprints/test_repo.json",
    "chars": 363,
    "preview": "{\n    \"Resources\": {\n        \"repo1Repository\": {\n            \"Properties\": {\n                \"RepositoryName\": \"repo1\"\n"
  },
  {
    "path": "tests/fixtures/stack_policies/default.json",
    "chars": 142,
    "preview": "{\n  \"Statement\" : [\n    {\n      \"Effect\" : \"Allow\",\n      \"Action\" : \"Update:*\",\n      \"Principal\": \"*\",\n      \"Resource"
  },
  {
    "path": "tests/fixtures/stack_policies/none.json",
    "chars": 141,
    "preview": "{\n  \"Statement\" : [\n    {\n      \"Effect\" : \"Deny\",\n      \"Action\" : \"Update:*\",\n      \"Principal\": \"*\",\n      \"Resource\""
  },
  {
    "path": "tests/run_test_suite.sh",
    "chars": 226,
    "preview": "#!/bin/sh\n\nTEST_ARGS=$*\n\nif [ -z \"$TEST_ARGS\" ]\nthen\n    _TESTS=\"test_suite\"\nelse\n    for T in ${TEST_ARGS}\n    do\n     "
  },
  {
    "path": "tests/stacker.yaml.sh",
    "chars": 328,
    "preview": "#!/bin/bash\n\ncat - <<EOF\nnamespace: ${STACKER_NAMESPACE}\nstacker_bucket: '' # No need to upload to S3\nstacks:\n  - name: "
  },
  {
    "path": "tests/test_helper.bash",
    "chars": 1563,
    "preview": "#!/usr/bin/env bash\n\n# To make the tests run faster, we don't wait between calls to DescribeStacks\n# to check on the sta"
  },
  {
    "path": "tests/test_suite/01_stacker_build_no_config.bats",
    "chars": 209,
    "preview": "#!/usr/bin/env bats\n\nload ../test_helper\n\n@test \"stacker build - no config\" {\n  stacker build\n  assert ! \"$status\" -eq 0"
  },
  {
    "path": "tests/test_suite/02_stacker_build_empty_config.bats",
    "chars": 192,
    "preview": "#!/usr/bin/env bats\n#\nload ../test_helper\n\n@test \"stacker build - empty config\" {\n  stacker build <(echo \"\")\n  assert ! "
  },
  {
    "path": "tests/test_suite/03_stacker_build-config_with_no_stacks.bats",
    "chars": 257,
    "preview": "#!/usr/bin/env bats\n\nload ../test_helper\n\n@test \"stacker build - config with no stacks\" {\n  needs_aws\n\n  stacker build -"
  },
  {
    "path": "tests/test_suite/04_stacker_build-config_with_no_namespace.bats",
    "chars": 319,
    "preview": "#!/usr/bin/env bats\n\nload ../test_helper\n\n@test \"stacker build - config with no namespace\" {\n  stacker build - <<EOF\nsta"
  },
  {
    "path": "tests/test_suite/05_stacker_build-missing_environment_key.bats",
    "chars": 666,
    "preview": "#!/usr/bin/env bats\n\nload ../test_helper\n\n@test \"stacker build - missing environment key\" {\n  environment() {\n    cat <<"
  },
  {
    "path": "tests/test_suite/06_stacker_build-duplicate_stacks.bats",
    "chars": 376,
    "preview": "#!/usr/bin/env bats\n\nload ../test_helper\n\n@test \"stacker build - duplicate stacks\" {\n  stacker build - <<EOF\nnamespace: "
  },
  {
    "path": "tests/test_suite/07_stacker_graph-json_format.bats",
    "chars": 1152,
    "preview": "#!/usr/bin/env bats\n\nload ../test_helper\n\n@test \"stacker graph - json format\" {\n  config() {\n    cat <<EOF\nnamespace: ${"
  }
]

// ... and 26 more files (download for full content)

About this extraction

This page contains the full source code of the remind101/stacker GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 226 files (802.1 KB), approximately 188.4k tokens, and a symbol index with 1202 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.
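The index entries shown above each carry a `path`, a `chars` size, and a truncated `preview`. As a minimal sketch (not part of the extraction itself), here is how a consuming tool might parse that index and filter files by size, e.g. to stay within a token budget; the sample data is copied from three entries above, and the `paths_under` helper is a hypothetical name:

```python
import json

# Sample of the index format used above ("path"/"chars" keys); values
# copied from three real entries in this extraction.
index_json = """
[
  {"path": "stacker/util.py", "chars": 30410},
  {"path": "stacker/tests/test_util.py", "chars": 15115},
  {"path": "stacker/ui.py", "chars": 1702}
]
"""

def paths_under(entries, max_chars):
    """Return paths of files no larger than max_chars, smallest first."""
    kept = [e for e in entries if e["chars"] <= max_chars]
    return [e["path"] for e in sorted(kept, key=lambda e: e["chars"])]

entries = json.loads(index_json)
print(paths_under(entries, 20000))
# → ['stacker/ui.py', 'stacker/tests/test_util.py']
```

The same pattern extends to the full 226-entry index: load it once, then slice by path prefix or size before feeding file contents to a model.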

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.
