Full Code of llllllllll/codetransformer for AI

master c5f551e915df cached
56 files
341.3 KB
86.0k tokens
524 symbols
1 requests
Download .txt
Showing preview only (360K chars total). Download the full file or copy to clipboard to get everything.
Repository: llllllllll/codetransformer
Branch: master
Commit: c5f551e915df
Files: 56
Total size: 341.3 KB

Directory structure:
gitextract_cgcyv2vr/

├── .coveragerc
├── .gitattributes
├── .gitignore
├── .travis.yml
├── LICENSE
├── MANIFEST.in
├── README.rst
├── codetransformer/
│   ├── __init__.py
│   ├── _version.py
│   ├── code.py
│   ├── core.py
│   ├── decompiler/
│   │   ├── _343.py
│   │   └── __init__.py
│   ├── instructions.py
│   ├── patterns.py
│   ├── tests/
│   │   ├── __init__.py
│   │   ├── test_code.py
│   │   ├── test_core.py
│   │   ├── test_decompiler.py
│   │   └── test_instructions.py
│   ├── transformers/
│   │   ├── __init__.py
│   │   ├── add2mul.py
│   │   ├── constants.py
│   │   ├── interpolated_strings.py
│   │   ├── literals.py
│   │   ├── pattern_matched_exceptions.py
│   │   ├── precomputed_slices.py
│   │   └── tests/
│   │       ├── __init__.py
│   │       ├── test_add2mul.py
│   │       ├── test_constants.py
│   │       ├── test_exc_patterns.py
│   │       ├── test_interpolated_strings.py
│   │       ├── test_literals.py
│   │       └── test_precomputed_slices.py
│   └── utils/
│       ├── __init__.py
│       ├── functional.py
│       ├── immutable.py
│       ├── instance.py
│       ├── no_default.py
│       ├── pretty.py
│       └── tests/
│           ├── __init__.py
│           ├── test_immutable.py
│           └── test_pretty.py
├── docs/
│   ├── .dir-locals.el
│   ├── Makefile
│   └── source/
│       ├── appendix.rst
│       ├── code-objects.rst
│       ├── conf.py
│       ├── index.rst
│       ├── magics.rst
│       └── patterns.rst
├── requirements_doc.txt
├── setup.cfg
├── setup.py
├── tox.ini
└── versioneer.py

================================================
FILE CONTENTS
================================================

================================================
FILE: .coveragerc
================================================
[run]
omit =
    codetransformer/_version.py


================================================
FILE: .gitattributes
================================================
codetransformer/_version.py export-subst


================================================
FILE: .gitignore
================================================
.bundle
db/*.sqlite3
log/*.log
*.log
tmp/**/*
tmp/*
*.swp
*~
#mac autosaving file
.DS_Store
*.py[co]

# Installer logs
pip-log.txt

# Unit test / coverage reports
.coverage
.tox
test.log
.noseids
*.xlsx

# Compiled python files
*.py[co]

# Packages
*.egg
*.egg-info
dist
build
eggs
cover
parts
bin
var
sdist
develop-eggs
.installed.cfg
coverage.xml
nosetests.xml

# C Extensions
*.o
*.so
*.out

# Vim
*.swp
*.swo

# Built documentation
docs/_build/*

# database of vbench
benchmarks.db

# Vagrant temp folder
.vagrant

# pypi
MANIFEST

# pytest
.cache

htmlcov


================================================
FILE: .travis.yml
================================================
language: python
sudo: false
python:
  - 3.4.3
  - 3.4
  - 3.5
  - 3.6

install:
  - pip install -e .[dev]

script:
  - py.test codetransformer
  - flake8 codetransformer

notifications:
  email: false


================================================
FILE: LICENSE
================================================
             GNU GENERAL PUBLIC LICENSE
                Version 2, June 1991

 Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                     Preamble

  The licenses for most software are designed to take away your
freedom to share and change it.  By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users.  This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it.  (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.)  You can apply it to
your programs, too.

  When we speak of free software, we are referring to freedom, not
price.  Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.

  To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.

  For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have.  You must make sure that they, too, receive or can get the
source code.  And you must show them these terms so they know their
rights.

  We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.

  Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software.  If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.

  Finally, any free program is threatened constantly by software
patents.  We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary.  To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.

  The precise terms and conditions for copying, distribution and
modification follow.

             GNU GENERAL PUBLIC LICENSE
   TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

  0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License.  The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language.  (Hereinafter, translation is included without limitation in
the term "modification".)  Each licensee is addressed as "you".

Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope.  The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.

  1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.

You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.

  2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:

    a) You must cause the modified files to carry prominent notices
    stating that you changed the files and the date of any change.

    b) You must cause any work that you distribute or publish, that in
    whole or in part contains or is derived from the Program or any
    part thereof, to be licensed as a whole at no charge to all third
    parties under the terms of this License.

    c) If the modified program normally reads commands interactively
    when run, you must cause it, when started running for such
    interactive use in the most ordinary way, to print or display an
    announcement including an appropriate copyright notice and a
    notice that there is no warranty (or else, saying that you provide
    a warranty) and that users may redistribute the program under
    these conditions, and telling the user how to view a copy of this
    License.  (Exception: if the Program itself is interactive but
    does not normally print such an announcement, your work based on
    the Program is not required to print an announcement.)

These requirements apply to the modified work as a whole.  If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works.  But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.

Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.

In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.

  3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:

    a) Accompany it with the complete corresponding machine-readable
    source code, which must be distributed under the terms of Sections
    1 and 2 above on a medium customarily used for software interchange; or,

    b) Accompany it with a written offer, valid for at least three
    years, to give any third party, for a charge no more than your
    cost of physically performing source distribution, a complete
    machine-readable copy of the corresponding source code, to be
    distributed under the terms of Sections 1 and 2 above on a medium
    customarily used for software interchange; or,

    c) Accompany it with the information you received as to the offer
    to distribute corresponding source code.  (This alternative is
    allowed only for noncommercial distribution and only if you
    received the program in object code or executable form with such
    an offer, in accord with Subsection b above.)

The source code for a work means the preferred form of the work for
making modifications to it.  For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable.  However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.

If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.

  4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License.  Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.

  5. You are not required to accept this License, since you have not
signed it.  However, nothing else grants you permission to modify or
distribute the Program or its derivative works.  These actions are
prohibited by law if you do not accept this License.  Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.

  6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions.  You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.

  7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License.  If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all.  For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.

If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.

It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices.  Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.

This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.

  8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded.  In such case, this License incorporates
the limitation as if written in the body of this License.

  9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time.  Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number.  If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation.  If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.

  10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission.  For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this.  Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.

                     NO WARRANTY

  11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW.  EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.  THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU.  SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.

  12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.

              END OF TERMS AND CONDITIONS

     How to Apply These Terms to Your New Programs

  If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

  To do so, attach the following notices to the program.  It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

    <one line to give the program's name and a brief idea of what it does.>
    Copyright (C) <year>  <name of author>

    This program is free software; you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation; either version 2 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License along
    with this program; if not, write to the Free Software Foundation, Inc.,
    51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

Also add information on how to contact you by electronic and paper mail.

If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:

    Gnomovision version 69, Copyright (C) year name of author
    Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
    This is free software, and you are welcome to redistribute it
    under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License.  Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.

You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary.  Here is a sample; alter the names:

  Yoyodyne, Inc., hereby disclaims all copyright interest in the program
  `Gnomovision' (which makes passes at compilers) written by James Hacker.

  <signature of Ty Coon>, 1 April 1989
  Ty Coon, President of Vice

This General Public License does not permit incorporating your program into
proprietary programs.  If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library.  If this is what you want to do, use the GNU Lesser General
Public License instead of this License.


================================================
FILE: MANIFEST.in
================================================
include versioneer.py
include codetransformer/_version.py


================================================
FILE: README.rst
================================================
``codetransformer``
===================

|build status| |documentation|

Bytecode transformers for CPython inspired by the ``ast`` module's
``NodeTransformer``.

What is ``codetransformer``?
----------------------------

``codetransformer`` is a library that allows us to work with CPython's bytecode
representation at runtime. ``codetransformer`` provides a level of abstraction
between the programmer and the raw bytes read by the eval loop so that we can
more easily inspect and modify bytecode.

``codetransformer`` is motivated by the need to override parts of the python
language that are not already hooked into through data model methods. For example:

* Override the ``is`` and ``not`` operators.
* Custom data structure literals.
* Syntax features that cannot be represented with valid python AST or source.
* Run without a modified CPython interpreter.

``codetransformer`` was originally developed as part of lazy_ to implement
the transformations needed to override the code objects at runtime.

Example Uses
------------

Overloading Literals
~~~~~~~~~~~~~~~~~~~~

While this can be done as an AST transformation, we will often need to execute
the constructor for the literal multiple times. Also, we need to be sure that
any additional names required to run our code are provided when we run. With
``codetransformer``, we can pre compute our new literals and emit code that is
as fast as loading our unmodified literals without requiring any additional
names be available implicitly.

In the following block we demonstrate overloading dictionary syntax to result in
``collections.OrderedDict`` objects. ``OrderedDict`` is like a ``dict``;
however, the order of the keys is preserved.

.. code-block:: python

   >>> from codetransformer.transformers.literals import ordereddict_literals
   >>> @ordereddict_literals
   ... def f():
   ...     return {'a': 1, 'b': 2, 'c': 3}
   >>> f()
   OrderedDict([('a', 1), ('b', 2), ('c', 3)])

This also supports dictionary comprehensions:

.. code-block:: python

   >>> @ordereddict_literals
   ... def f():
   ...     return {k: v for k, v in zip('abc', (1, 2, 3))}
   >>> f()
   OrderedDict([('a', 1), ('b', 2), ('c', 3)])

The next block overrides ``float`` literals with ``decimal.Decimal``
objects. These objects support arbitrary precision arithmetic.

.. code-block:: python

   >>> from codetransformer.transformers.literals import decimal_literals
   >>> @decimal_literals
   ... def f():
   ...     return 1.5
   >>> f()
   Decimal('1.5')

Pattern Matched Exceptions
~~~~~~~~~~~~~~~~~~~~~~~~~~

Pattern matched exceptions are a good example of a ``CodeTransformer`` that
would be very complicated to implement at the AST level. This transformation
extends the ``try/except`` syntax to accept instances of ``BaseException`` as
well subclasses of ``BaseException``. When excepting an instance, the ``args``
of the exception will be compared for equality to determine which exception
handler should be invoked. For example:

.. code-block:: python

   >>> @pattern_matched_exceptions()
   ... def foo():
   ...     try:
   ...         raise ValueError('bar')
   ...     except ValueError('buzz'):
   ...         return 'buzz'
   ...     except ValueError('bar'):
   ...         return 'bar'
   >>> foo()
   'bar'

This function raises an instance of ``ValueError`` and attempts to catch it. The
first check looks for instances of ``ValueError`` that were constructed with an
argument of ``'buzz'``. Because our custom exception is raised with ``'bar'``,
these are not equal and we do not enter this handler. The next handler looks for
``ValueError('bar')`` which does match the exception we raised. We then enter
this block and normal python rules take over.

We may also pass their own exception matching function:

.. code-block:: python

    >>> def match_greater(match_expr, exc_type, exc_value, exc_traceback):
    ...     return math_expr > exc_value.args[0]

    >>> @pattern_matched_exceptions(match_greater)
    ... def foo():
    ...     try:
    ...         raise ValueError(5)
    ...     except 4:
    ...         return 4
    ...     except 5:
    ...         return 5
    ...     except 6:
    ...         return 6
    >>> foo()
    6

This matches on when the match expression is greater in value than the first
argument of any exception type that is raised. This particular behavior would be
very hard to mimic through AST level transformations.

Core Abstractions
-----------------

The three core abstractions of ``codetransformer`` are:

1. The ``Instruction`` object which represents an opcode_ which may be paired
   with some argument.
2. The ``Code`` object which represents a collection of ``Instruction``\s.
3. The ``CodeTransformer`` object which represents a set of rules for
   manipulating ``Code`` objects.

Instructions
~~~~~~~~~~~~

The ``Instruction`` object represents an atomic operation that can be performed
by the CPython virtual machine. These are things like ``LOAD_NAME`` which loads
a name onto the stack, or ``ROT_TWO`` which rotates the top two stack elements.

Some instructions accept an argument, for example ``LOAD_NAME``, which modifies
the behavior of the instruction. This is much like a function call where some
functions accept arguments. Because the bytecode is always packed as raw bytes,
the argument must be some integer (CPython stores all arguments two in bytes).
This means that things that need a more rich argument system (like ``LOAD_NAME``
which needs the actual name to look up) must carry around the actual arguments
in some table and use the integer as an offset into this array. One of the key
abstractions of the ``Instruction`` object is that the argument is always some
python object that represents the actual argument. Any lookup table management
is handled for the user. This is helpful because some arguments share this table
so we don't want to add extra entries or forget to add them at all.

Another annoyance is that the instructions that handle control flow use their
argument to say what bytecode offset to jump to. Some jumps use the absolute
index, others use a relative index. This also makes it hard if you want to add
or remove instructions because all of the offsets must be recomputed. In
``codetransformer``, the jump instructions all accept another ``Instruction`` as
the argument so that the assembler can manage this for the user. We also provide
an easy way for new instructions to "steal" jumps that targeted another
instruction so that can manage altering the bytecode around jump targets.

Code
~~~~

``Code`` objects are a nice abstraction over python's
``types.CodeType``. Quoting the ``CodeType`` constructor docstring:

::

   code(argcount, kwonlyargcount, nlocals, stacksize, flags, codestring,
         constants, names, varnames, filename, name, firstlineno,
         lnotab[, freevars[, cellvars]])

   Create a code object.  Not for the faint of heart.

The ``codetransformer`` abstraction is designed to make it easy to dynamically
construct and inspect these objects. This allows us to easy set things like the
argument names, and manipulate the line number mappings.

The ``Code`` object provides methods for converting to and from Python's code
representation:

1. ``from_pycode``
2. ``to_pycode``.

This allows us to take an existing function, parse the meaning from it, modify
it, and then assemble this back into a new python code object.

.. note::

   ``Code`` objects are immutable. When we say "modify", we mean create a copy
   with different values.

CodeTransformers
----------------

This is the set of rules that are used to actually modify the ``Code``
objects. These rules are defined as a set of ``patterns`` which are a DSL used
to define a DFA for matching against sequences of ``Instruction`` objects. Once
we have matched a segment, we yield new instructions to replace what we have
matched. A simple codetransformer looks like:

.. code-block:: python

   from codetransformer import CodeTransformer, instructions

   class FoldNames(CodeTransformer):
       @pattern(
           instructions.LOAD_GLOBAL,
           instructions.LOAD_GLOBAL,
           instructions.BINARY_ADD,
       )
       def _load_fast(self, a, b, add):
           yield instructions.LOAD_FAST(a.arg + b.arg).steal(a)

This ``CodeTransformer`` uses the ``+`` operator to implement something like
``CPP``\s token pasting for local variables. We read this pattern as a sequence
of two ``LOAD_GLOBAL`` (global name lookups) followed by a ``BINARY_ADD``
instruction (``+`` operator call). This will then call the function with the
three instructions passed positionally. This handler replaces this sequence with
a single instruction that emits a ``LOAD_FAST`` (local name lookup) that is the
result of adding the two names together. We then steal any jumps that used to
target the first ``LOAD_GLOBAL``.

We can execute this transformer by calling an instance of it on a
function object, or using it like a decorator. For example:

.. code-block:: python

   >>> @FoldNames()
   ... def f():
   ...     ab = 3
   ...     return a + b
   >>> f()
   3


License
-------

``codetransformer`` is free software, licensed under the GNU General Public
License, version 2. For more information see the ``LICENSE`` file.


Source
------

Source code is hosted on github at
https://github.com/llllllllll/codetransformer.


.. _lazy: https://github.com/llllllllll/lazy_python
.. _opcode: https://docs.python.org/3.5/library/dis.html#opcode-NOP
.. |build status| image:: https://travis-ci.org/llllllllll/codetransformer.svg?branch=master
   :target: https://travis-ci.org/llllllllll/codetransformer
.. |documentation| image:: https://readthedocs.org/projects/codetransformer/badge/?version=stable
   :target: http://codetransformer.readthedocs.io/en/stable/?badge=stable
   :alt: Documentation Status


================================================
FILE: codetransformer/__init__.py
================================================
from .code import Code, Flag
from .core import CodeTransformer
from . patterns import (
    matchany,
    not_,
    option,
    or_,
    pattern,
    plus,
    seq,
    var,
)
from . import instructions
from . import transformers
from .utils.pretty import a, d, display, pprint_ast, pformat_ast
from ._version import get_versions


__version__ = get_versions()['version']
del get_versions


def load_ipython_extension(ipython):  # pragma: no cover

    def dis_magic(line, cell=None):
        if cell is None:
            return d(line)
        return d(cell)
    ipython.register_magic_function(dis_magic, 'line_cell', 'dis')

    def ast_magic(line, cell=None):
        if cell is None:
            return a(line)
        return a(cell)
    ipython.register_magic_function(ast_magic, 'line_cell', 'ast')


__all__ = [
    'a',
    'd',
    'display',
    'Code',
    'CodeTransformer',
    'Flag',
    'instructions',
    'matchany',
    'not_',
    'option',
    'or_',
    'pattern',
    'pattern',
    'plus',
    'pformat_ast',
    'pprint_ast',
    'seq',
    'var',
    'transformers',
]


================================================
FILE: codetransformer/_version.py
================================================

# This file helps to compute a version number in source trees obtained from
# git-archive tarball (such as those provided by githubs download-from-tag
# feature). Distribution tarballs (built by setup.py sdist) and build
# directories (produced by setup.py build) will contain a much shorter file
# that just contains the computed version number.

# This file is released into the public domain. Generated by
# versioneer-0.15 (https://github.com/warner/python-versioneer)

import errno
import os
import re
import subprocess
import sys


def get_keywords():
    # these strings will be replaced by git during git-archive.
    # setup.py/versioneer.py will grep for the variable names, so they must
    # each be defined on a line of their own. _version.py will just call
    # get_keywords().
    git_refnames = "$Format:%d$"
    git_full = "$Format:%H$"
    keywords = {"refnames": git_refnames, "full": git_full}
    return keywords


class VersioneerConfig:
    pass


def get_config():
    # these strings are filled in when 'setup.py versioneer' creates
    # _version.py
    cfg = VersioneerConfig()
    cfg.VCS = "git"
    cfg.style = "pep440"
    cfg.tag_prefix = ""
    cfg.parentdir_prefix = "codetransformer-"
    cfg.versionfile_source = "codetransformer/_version.py"
    cfg.verbose = False
    return cfg


class NotThisMethod(Exception):
    pass


LONG_VERSION_PY = {}
HANDLERS = {}


def register_vcs_handler(vcs, method):  # decorator
    def decorate(f):
        if vcs not in HANDLERS:
            HANDLERS[vcs] = {}
        HANDLERS[vcs][method] = f
        return f
    return decorate


def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False):
    assert isinstance(commands, list)
    p = None
    for c in commands:
        try:
            dispcmd = str([c] + args)
            # remember shell=False, so use git.cmd on windows, not just git
            p = subprocess.Popen([c] + args, cwd=cwd, stdout=subprocess.PIPE,
                                 stderr=(subprocess.PIPE if hide_stderr
                                         else None))
            break
        except EnvironmentError:
            e = sys.exc_info()[1]
            if e.errno == errno.ENOENT:
                continue
            if verbose:
                print("unable to run %s" % dispcmd)
                print(e)
            return None
    else:
        if verbose:
            print("unable to find command, tried %s" % (commands,))
        return None
    stdout = p.communicate()[0].strip()
    if sys.version_info[0] >= 3:
        stdout = stdout.decode()
    if p.returncode != 0:
        if verbose:
            print("unable to run %s (error)" % dispcmd)
        return None
    return stdout


def versions_from_parentdir(parentdir_prefix, root, verbose):
    # Source tarballs conventionally unpack into a directory that includes
    # both the project name and a version string.
    dirname = os.path.basename(root)
    if not dirname.startswith(parentdir_prefix):
        if verbose:
            print("guessing rootdir is '%s', but '%s' doesn't start with "
                  "prefix '%s'" % (root, dirname, parentdir_prefix))
        raise NotThisMethod("rootdir doesn't start with parentdir_prefix")
    return {"version": dirname[len(parentdir_prefix):],
            "full-revisionid": None,
            "dirty": False, "error": None}


@register_vcs_handler("git", "get_keywords")
def git_get_keywords(versionfile_abs):
    # the code embedded in _version.py can just fetch the value of these
    # keywords. When used from setup.py, we don't want to import _version.py,
    # so we do it with a regexp instead. This function is not used from
    # _version.py.
    keywords = {}
    try:
        f = open(versionfile_abs, "r")
        for line in f.readlines():
            if line.strip().startswith("git_refnames ="):
                mo = re.search(r'=\s*"(.*)"', line)
                if mo:
                    keywords["refnames"] = mo.group(1)
            if line.strip().startswith("git_full ="):
                mo = re.search(r'=\s*"(.*)"', line)
                if mo:
                    keywords["full"] = mo.group(1)
        f.close()
    except EnvironmentError:
        pass
    return keywords


@register_vcs_handler("git", "keywords")
def git_versions_from_keywords(keywords, tag_prefix, verbose):
    if not keywords:
        raise NotThisMethod("no keywords at all, weird")
    refnames = keywords["refnames"].strip()
    if refnames.startswith("$Format"):
        if verbose:
            print("keywords are unexpanded, not using")
        raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
    refs = set([r.strip() for r in refnames.strip("()").split(",")])
    # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
    # just "foo-1.0". If we see a "tag: " prefix, prefer those.
    TAG = "tag: "
    tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
    if not tags:
        # Either we're using git < 1.8.3, or there really are no tags. We use
        # a heuristic: assume all version tags have a digit. The old git %d
        # expansion behaves like git log --decorate=short and strips out the
        # refs/heads/ and refs/tags/ prefixes that would let us distinguish
        # between branches and tags. By ignoring refnames without digits, we
        # filter out many common branch names like "release" and
        # "stabilization", as well as "HEAD" and "master".
        tags = set([r for r in refs if re.search(r'\d', r)])
        if verbose:
            print("discarding '%s', no digits" % ",".join(refs-tags))
    if verbose:
        print("likely tags: %s" % ",".join(sorted(tags)))
    for ref in sorted(tags):
        # sorting will prefer e.g. "2.0" over "2.0rc1"
        if ref.startswith(tag_prefix):
            r = ref[len(tag_prefix):]
            if verbose:
                print("picking %s" % r)
            return {"version": r,
                    "full-revisionid": keywords["full"].strip(),
                    "dirty": False, "error": None
                    }
    # no suitable tags, so version is "0+unknown", but full hex is still there
    if verbose:
        print("no suitable tags, using unknown + full revision id")
    return {"version": "0+unknown",
            "full-revisionid": keywords["full"].strip(),
            "dirty": False, "error": "no suitable tags"}


@register_vcs_handler("git", "pieces_from_vcs")
def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
    # this runs 'git' from the root of the source tree. This only gets called
    # if the git-archive 'subst' keywords were *not* expanded, and
    # _version.py hasn't already been rewritten with a short version string,
    # meaning we're inside a checked out source tree.

    if not os.path.exists(os.path.join(root, ".git")):
        if verbose:
            print("no .git in %s" % root)
        raise NotThisMethod("no .git directory")

    GITS = ["git"]
    if sys.platform == "win32":
        GITS = ["git.cmd", "git.exe"]
    # if there is a tag, this yields TAG-NUM-gHEX[-dirty]
    # if there are no tags, this yields HEX[-dirty] (no NUM)
    describe_out = run_command(GITS, ["describe", "--tags", "--dirty",
                                      "--always", "--long"],
                               cwd=root)
    # --long was added in git-1.5.5
    if describe_out is None:
        raise NotThisMethod("'git describe' failed")
    describe_out = describe_out.strip()
    full_out = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
    if full_out is None:
        raise NotThisMethod("'git rev-parse' failed")
    full_out = full_out.strip()

    pieces = {}
    pieces["long"] = full_out
    pieces["short"] = full_out[:7]  # maybe improved later
    pieces["error"] = None

    # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
    # TAG might have hyphens.
    git_describe = describe_out

    # look for -dirty suffix
    dirty = git_describe.endswith("-dirty")
    pieces["dirty"] = dirty
    if dirty:
        git_describe = git_describe[:git_describe.rindex("-dirty")]

    # now we have TAG-NUM-gHEX or HEX

    if "-" in git_describe:
        # TAG-NUM-gHEX
        mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
        if not mo:
            # unparseable. Maybe git-describe is misbehaving?
            pieces["error"] = ("unable to parse git-describe output: '%s'"
                               % describe_out)
            return pieces

        # tag
        full_tag = mo.group(1)
        if not full_tag.startswith(tag_prefix):
            if verbose:
                fmt = "tag '%s' doesn't start with prefix '%s'"
                print(fmt % (full_tag, tag_prefix))
            pieces["error"] = ("tag '%s' doesn't start with prefix '%s'"
                               % (full_tag, tag_prefix))
            return pieces
        pieces["closest-tag"] = full_tag[len(tag_prefix):]

        # distance: number of commits since tag
        pieces["distance"] = int(mo.group(2))

        # commit: short hex revision ID
        pieces["short"] = mo.group(3)

    else:
        # HEX: no tags
        pieces["closest-tag"] = None
        count_out = run_command(GITS, ["rev-list", "HEAD", "--count"],
                                cwd=root)
        pieces["distance"] = int(count_out)  # total number of commits

    return pieces


def plus_or_dot(pieces):
    if "+" in pieces.get("closest-tag", ""):
        return "."
    return "+"


def render_pep440(pieces):
    # now build up version string, with post-release "local version
    # identifier". Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you
    # get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty

    # exceptions:
    # 1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty]

    if pieces["closest-tag"]:
        rendered = pieces["closest-tag"]
        if pieces["distance"] or pieces["dirty"]:
            rendered += plus_or_dot(pieces)
            rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
            if pieces["dirty"]:
                rendered += ".dirty"
    else:
        # exception #1
        rendered = "0+untagged.%d.g%s" % (pieces["distance"],
                                          pieces["short"])
        if pieces["dirty"]:
            rendered += ".dirty"
    return rendered


def render_pep440_pre(pieces):
    # TAG[.post.devDISTANCE] . No -dirty

    # exceptions:
    # 1: no tags. 0.post.devDISTANCE

    if pieces["closest-tag"]:
        rendered = pieces["closest-tag"]
        if pieces["distance"]:
            rendered += ".post.dev%d" % pieces["distance"]
    else:
        # exception #1
        rendered = "0.post.dev%d" % pieces["distance"]
    return rendered


def render_pep440_post(pieces):
    # TAG[.postDISTANCE[.dev0]+gHEX] . The ".dev0" means dirty. Note that
    # .dev0 sorts backwards (a dirty tree will appear "older" than the
    # corresponding clean one), but you shouldn't be releasing software with
    # -dirty anyways.

    # exceptions:
    # 1: no tags. 0.postDISTANCE[.dev0]

    if pieces["closest-tag"]:
        rendered = pieces["closest-tag"]
        if pieces["distance"] or pieces["dirty"]:
            rendered += ".post%d" % pieces["distance"]
            if pieces["dirty"]:
                rendered += ".dev0"
            rendered += plus_or_dot(pieces)
            rendered += "g%s" % pieces["short"]
    else:
        # exception #1
        rendered = "0.post%d" % pieces["distance"]
        if pieces["dirty"]:
            rendered += ".dev0"
        rendered += "+g%s" % pieces["short"]
    return rendered


def render_pep440_old(pieces):
    # TAG[.postDISTANCE[.dev0]] . The ".dev0" means dirty.

    # exceptions:
    # 1: no tags. 0.postDISTANCE[.dev0]

    if pieces["closest-tag"]:
        rendered = pieces["closest-tag"]
        if pieces["distance"] or pieces["dirty"]:
            rendered += ".post%d" % pieces["distance"]
            if pieces["dirty"]:
                rendered += ".dev0"
    else:
        # exception #1
        rendered = "0.post%d" % pieces["distance"]
        if pieces["dirty"]:
            rendered += ".dev0"
    return rendered


def render_git_describe(pieces):
    # TAG[-DISTANCE-gHEX][-dirty], like 'git describe --tags --dirty
    # --always'

    # exceptions:
    # 1: no tags. HEX[-dirty]  (note: no 'g' prefix)

    if pieces["closest-tag"]:
        rendered = pieces["closest-tag"]
        if pieces["distance"]:
            rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
    else:
        # exception #1
        rendered = pieces["short"]
    if pieces["dirty"]:
        rendered += "-dirty"
    return rendered


def render_git_describe_long(pieces):
    # TAG-DISTANCE-gHEX[-dirty], like 'git describe --tags --dirty
    # --always -long'. The distance/hash is unconditional.

    # exceptions:
    # 1: no tags. HEX[-dirty]  (note: no 'g' prefix)

    if pieces["closest-tag"]:
        rendered = pieces["closest-tag"]
        rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
    else:
        # exception #1
        rendered = pieces["short"]
    if pieces["dirty"]:
        rendered += "-dirty"
    return rendered


def render(pieces, style):
    if pieces["error"]:
        return {"version": "unknown",
                "full-revisionid": pieces.get("long"),
                "dirty": None,
                "error": pieces["error"]}

    if not style or style == "default":
        style = "pep440"  # the default

    if style == "pep440":
        rendered = render_pep440(pieces)
    elif style == "pep440-pre":
        rendered = render_pep440_pre(pieces)
    elif style == "pep440-post":
        rendered = render_pep440_post(pieces)
    elif style == "pep440-old":
        rendered = render_pep440_old(pieces)
    elif style == "git-describe":
        rendered = render_git_describe(pieces)
    elif style == "git-describe-long":
        rendered = render_git_describe_long(pieces)
    else:
        raise ValueError("unknown style '%s'" % style)

    return {"version": rendered, "full-revisionid": pieces["long"],
            "dirty": pieces["dirty"], "error": None}


def get_versions():
    # I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have
    # __file__, we can work backwards from there to the root. Some
    # py2exe/bbfreeze/non-CPython implementations don't do __file__, in which
    # case we can only use expanded keywords.

    cfg = get_config()
    verbose = cfg.verbose

    try:
        return git_versions_from_keywords(get_keywords(), cfg.tag_prefix,
                                          verbose)
    except NotThisMethod:
        pass

    try:
        root = os.path.realpath(__file__)
        # versionfile_source is the relative path from the top of the source
        # tree (where the .git directory might live) to this file. Invert
        # this to find the root from __file__.
        for i in cfg.versionfile_source.split('/'):
            root = os.path.dirname(root)
    except NameError:
        return {"version": "0+unknown", "full-revisionid": None,
                "dirty": None,
                "error": "unable to find root of source tree"}

    try:
        pieces = git_pieces_from_vcs(cfg.tag_prefix, root, verbose)
        return render(pieces, cfg.style)
    except NotThisMethod:
        pass

    try:
        if cfg.parentdir_prefix:
            return versions_from_parentdir(cfg.parentdir_prefix, root, verbose)
    except NotThisMethod:
        pass

    return {"version": "0+unknown", "full-revisionid": None,
            "dirty": None,
            "error": "unable to compute version"}


================================================
FILE: codetransformer/code.py
================================================
from collections import OrderedDict
from dis import Bytecode, dis, findlinestarts
from enum import IntEnum, unique
from functools import reduce
from itertools import repeat
import operator as op
import sys
from types import CodeType

from .instructions import (
    Instruction,
    LOAD_CONST,
    YIELD_FROM,
    YIELD_VALUE,
    _RawArg,
)
from .utils.functional import scanl, reverse_dict, ffill
from .utils.immutable import lazyval
from .utils.instance import instance


WORDCODE = sys.version_info >= (3, 6)
if WORDCODE:
    argsize = 1
    max_lnotab_increment = 127

    def _sparse_args(instrs):
        for instr in instrs:
            yield instr
            yield None

else:
    argsize = 2
    max_lnotab_increment = 255

    def _sparse_args(instrs):
        for instr in instrs:
            yield instr
            if instr.have_arg:
                yield None
                yield None


_sparse_args.__doc__ = """\
Makes the arguments sparse so that instructions live at the correct index for
the jump resolution step.

This pads the instruction set with None to mark the bytes occupied by
arguments.

Parameters
----------
instrs : iterable of Instruction
    The dense instruction set.

Yields
------
sparse : Instruction or None
    Yields the instructions, with objects marking the bytes that are used for
    arguments.
"""


@unique
class Flag(IntEnum):
    """
    An enum describing the bitmask of flags that can be set on a code object.
    """
    # These enum values and comments are taken from CPython.
    CO_OPTIMIZED = 0x0001
    CO_NEWLOCALS = 0x0002
    CO_VARARGS = 0x0004
    CO_VARKEYWORDS = 0x0008
    CO_NESTED = 0x0010
    CO_GENERATOR = 0x0020

    # The CO_NOFREE flag is set if there are no free or cell variables.
    # This information is redundant, but it allows a single flag test
    # to determine whether there is any extra work to be done when the
    # call frame it setup.
    CO_NOFREE = 0x0040

    # The CO_COROUTINE flag is set for coroutines creates with the
    # types.coroutine decorator. This converts old-style coroutines into
    # python3.5 style coroutines.
    CO_COROUTINE = 0x0080
    CO_ITERABLE_COROUTINE = 0x0100

    # Old values:
    CO_FUTURE_DIVISION = 0x2000
    CO_FUTURE_ABSOLUTE_IMPORT = 0x4000  # Do absolute imports by default.
    CO_FUTURE_WITH_STATEMENT = 0x8000
    CO_FUTURE_PRINT_FUNCTION = 0x10000
    CO_FUTURE_UNICODE_LITERALS = 0x20000

    CO_FUTURE_BARRY_AS_BDFL = 0x40000
    CO_FUTURE_GENERATOR_STOP = 0x80000

    @instance
    class max:
        """The largest bitmask that represents a valid flag.
        """
        def __get__(self, instance, owner):
            return owner.pack(**dict(zip(owner.__members__, repeat(True))))

        def __set__(self, instance, value):
            raise AttributeError("can't set 'max' attribute")

    @classmethod
    def pack(cls,
             *,
             CO_OPTIMIZED,
             CO_NEWLOCALS,
             CO_VARARGS,
             CO_VARKEYWORDS,
             CO_NESTED,
             CO_GENERATOR,
             CO_NOFREE,
             CO_COROUTINE,
             CO_ITERABLE_COROUTINE,
             CO_FUTURE_DIVISION,
             CO_FUTURE_ABSOLUTE_IMPORT,
             CO_FUTURE_WITH_STATEMENT,
             CO_FUTURE_PRINT_FUNCTION,
             CO_FUTURE_UNICODE_LITERALS,
             CO_FUTURE_BARRY_AS_BDFL,
             CO_FUTURE_GENERATOR_STOP):
        """Pack a flags into a bitmask.

        I hope you like kwonly args.

        Parameters
        ----------
        CO_OPTIMIZED : bool
        CO_NEWLOCALS : bool
        CO_VARARGS : bool
        CO_VARKEYWORDS : bool
        CO_NESTED : bool
        CO_GENERATOR : bool
        CO_NOFREE : bool
        CO_COROUTINE : bool
        CO_ITERABLE_COROUTINE : bool
        CO_FUTURE_DIVISION : bool
        CO_FUTURE_ABSOLUTE_IMPORT : bool
        CO_FUTURE_WITH_STATEMENT : bool
        CO_FUTURE_PRINT_FUNCTION : bool
        CO_FUTURE_UNICODE_LITERALS : bool
        CO_FUTURE_BARRY_AS_BDFL : bool
        CO_FUTURE_GENERATOR_STOP : bool

        Returns
        -------
        mask : int

        See Also
        --------
        codetransformer.code.Flag.unpack
        """
        ls = locals()
        return reduce(
            op.or_,
            (v for k, v in cls.__members__.items() if ls[k]),
            0,
        )

    @classmethod
    def unpack(cls, mask):
        """Unpack a bitmask into a map of flag to bool.

        Parameters
        ----------
        mask : int
            A bitmask

        Returns
        -------
        mapping : OrderedDict[str -> bool]
            The mapping of flag name to flag status.

        See Also
        --------
        codetransformer.code.Flag.pack
        """
        if mask > cls.max:
            raise ValueError('Invalid mask, too large: %d' % mask)

        return OrderedDict(
            (k, bool(mask & getattr(cls, k)))
            for k, v in cls.__members__.items()
        )


def _freevar_argname(arg, cellvars, freevars):
    """
    Get the name of the variable manipulated by a 'uses_free' instruction.

    Parameters
    ----------
    arg : int
        The raw argument to a uses_free instruction that we want to resolve to
        a name.
    cellvars : list[str]
        The co_cellvars of the function for which we want to resolve `arg`.
    freevars : list[str]
        The co_freevars of the function for which we want to resolve `arg`.

    Notes
    -----
    From https://docs.python.org/3.5/library/dis.html#opcode-LOAD_CLOSURE:

        The name of the variable is co_cellvars[i] if i is less than the length
        of co_cellvars. Otherwise it is co_freevars[i - len(co_cellvars)]
    """
    len_cellvars = len(cellvars)
    if arg < len_cellvars:
        return cellvars[arg]
    return freevars[arg - len_cellvars]


def pycode(argcount,
           kwonlyargcount,
           nlocals,
           stacksize,
           flags,
           codestring,
           constants,
           names,
           varnames,
           filename,
           name,
           firstlineno,
           lnotab,
           freevars=(),
           cellvars=()):
    """types.CodeType constructor that accepts keyword arguments.

    See Also
    --------
    types.CodeType
    """
    return CodeType(
        argcount,
        kwonlyargcount,
        nlocals,
        stacksize,
        flags,
        codestring,
        constants,
        names,
        varnames,
        filename,
        name,
        firstlineno,
        lnotab,
        freevars,
        cellvars,
    )


class Code:
    """A higher abstraction over python's CodeType.

    See Include/code.h for more information.

    Parameters
    ----------
    instrs : iterable of Instruction
        A sequence of codetransformer Instruction objects.
    argnames : iterable of str, optional
        The names of the arguments to the code object.
    name : str, optional
        The name of this code object.
    filename : str, optional
        The file that this code object came from.
    firstlineno : int, optional
        The first line number of the code in this code object.
    lnotab : dict[Instruction -> int], optional
        The mapping from instruction to the line that it starts.
    flags : dict[str -> bool], optional
        Any flags to set. This updates the default flag set.

    Attributes
    ----------
    argcount
    argnames
    cellvars
    constructs_new_locals
    consts
    filename
    flags
    freevars
    instrs
    is_coroutine
    is_generator
    is_iterable_coroutine
    is_nested
    kwonlyargcount
    lnotab
    name
    names
    py_lnotab
    sparse_instrs
    stacksize
    varnames
    """
    __slots__ = (
        '_instrs',
        '_argnames',
        '_argcount',
        '_kwonlyargcount',
        '_cellvars',
        '_freevars',
        '_name',
        '_filename',
        '_firstlineno',
        '_lnotab',
        '_flags',
        '__weakref__',
    )

    def __init__(self,
                 instrs,
                 argnames=(),
                 *,
                 cellvars=(),
                 freevars=(),
                 name='<code>',
                 filename='<code>',
                 firstlineno=1,
                 lnotab=None,
                 flags=None):

        instrs = tuple(instrs)  # strictly evaluate any generators.

        # The starting varnames (the names of the arguments to the function)
        argcount = [0]
        kwonlyargcount = [0]
        argcounter = argcount  # Which set of args are we currently counting.
        _argnames = []
        append_argname = _argnames.append
        varg = kwarg = None
        for argname in argnames:
            if argname.startswith('**'):
                if kwarg is not None:
                    raise ValueError('cannot specify **kwargs more than once')
                kwarg = argname[2:]
                continue
            elif argname.startswith('*'):
                if varg is not None:
                    raise ValueError('cannot specify *args more than once')
                varg = argname[1:]
                argcounter = kwonlyargcount  # all following args are kwonly.
                continue
            argcounter[0] += 1
            append_argname(argname)

        if varg is not None:
            append_argname(varg)
        if kwarg is not None:
            append_argname(kwarg)

        cellvar_names = set(cellvars)
        freevar_names = set(freevars)
        for instr in filter(op.attrgetter('uses_free'), instrs):
            if instr.arg in cellvar_names:
                instr._vartype = 'cell'
            elif instr.arg in freevar_names:
                instr._vartype = 'free'
            else:
                raise ValueError(
                    "Argument to %r is not in cellvars or freevars." % instr
                )

        for instr in filter(op.attrgetter('is_jmp'), instrs):
            instr.arg._target_of.add(instr)

        self._instrs = instrs
        self._argnames = tuple(_argnames)
        self._argcount = argcount[0]
        self._kwonlyargcount = kwonlyargcount[0]
        self._cellvars = cellvars
        self._freevars = freevars
        self._name = name
        self._filename = filename
        self._firstlineno = firstlineno
        self._lnotab = lnotab or {}
        self._flags = Flag.pack(**dict(
            dict(
                CO_OPTIMIZED=True,
                CO_NEWLOCALS=True,
                CO_VARARGS=varg is not None,
                CO_VARKEYWORDS=kwarg is not None,
                CO_NESTED=False,
                CO_GENERATOR=any(
                    isinstance(instr, (YIELD_VALUE, YIELD_FROM))
                    for instr in instrs
                ),
                CO_NOFREE=not any(map(op.attrgetter('uses_free'), instrs)),
                CO_COROUTINE=False,
                CO_ITERABLE_COROUTINE=False,
                CO_FUTURE_DIVISION=False,
                CO_FUTURE_ABSOLUTE_IMPORT=False,
                CO_FUTURE_WITH_STATEMENT=False,
                CO_FUTURE_PRINT_FUNCTION=False,
                CO_FUTURE_UNICODE_LITERALS=False,
                CO_FUTURE_BARRY_AS_BDFL=False,
                CO_FUTURE_GENERATOR_STOP=False,
            ),
            **flags or {}
        ))

    @classmethod
    def from_pyfunc(cls, f):
        """Create a Code object from a python function object.

        Parameters
        ----------
        f : function
            The function from which to construct a code object.

        Returns
        -------
        code : Code
            A Code object representing f.__code__.
        """
        return cls.from_pycode(f.__code__)

    @classmethod
    def from_pycode(cls, co):
        """Create a Code object from a python code object.

        Parameters
        ----------
        co : CodeType
            The python code object.

        Returns
        -------
        code : Code
            The codetransformer Code object.
        """
        # Make it sparse to instrs[n] is the instruction at bytecode[n]
        sparse_instrs = tuple(
            _sparse_args(
                Instruction.from_opcode(
                    b.opcode,
                    Instruction._no_arg if b.arg is None else _RawArg(b.arg),
                ) for b in Bytecode(co)
            ),
        )
        for idx, instr in enumerate(sparse_instrs):
            if instr is None:
                # The sparse value
                continue
            if instr.absjmp:
                instr.arg = sparse_instrs[instr.arg]
            elif instr.reljmp:
                instr.arg = sparse_instrs[instr.arg + idx + argsize + 1]
            elif isinstance(instr, LOAD_CONST):
                instr.arg = co.co_consts[instr.arg]
            elif instr.uses_name:
                instr.arg = co.co_names[instr.arg]
            elif instr.uses_varname:
                instr.arg = co.co_varnames[instr.arg]
            elif instr.uses_free:
                instr.arg = _freevar_argname(
                    instr.arg,
                    co.co_freevars,
                    co.co_cellvars,
                )
            elif instr.have_arg and isinstance(instr.arg, _RawArg):
                instr.arg = int(instr.arg)

        flags = Flag.unpack(co.co_flags)
        has_vargs = flags['CO_VARARGS']
        has_kwargs = flags['CO_VARKEYWORDS']

        # Here we convert the varnames format into our argnames format.
        paramnames = co.co_varnames[
            :(co.co_argcount +
              co.co_kwonlyargcount +
              has_vargs +
              has_kwargs)
        ]
        # We start with the positional arguments.
        new_paramnames = list(paramnames[:co.co_argcount])
        # Add *args next.
        if has_vargs:
            new_paramnames.append('*' + paramnames[-1 - has_kwargs])
        # Add positional only arguments next.
        new_paramnames.extend(paramnames[
            co.co_argcount:co.co_argcount + co.co_kwonlyargcount
        ])
        # Add **kwargs last.
        if has_kwargs:
            new_paramnames.append('**' + paramnames[-1])

        return cls(
            filter(bool, sparse_instrs),
            argnames=new_paramnames,
            cellvars=co.co_cellvars,
            freevars=co.co_freevars,
            name=co.co_name,
            filename=co.co_filename,
            firstlineno=co.co_firstlineno,
            lnotab={
                lno: sparse_instrs[off] for off, lno in findlinestarts(co)
            },
            flags=flags,
        )

    def to_pycode(self):
        """Create a python code object from the more abstract
        codetransfomer.Code object.

        Returns
        -------
        co : CodeType
            The python code object.
        """
        consts = self.consts
        names = self.names
        varnames = self.varnames
        freevars = self.freevars
        cellvars = self.cellvars
        bc = bytearray()
        for instr in self.instrs:
            bc.append(instr.opcode)  # Write the opcode byte.
            if isinstance(instr, LOAD_CONST):
                # Resolve the constant index.
                bc.extend(consts.index(instr.arg).to_bytes(argsize, 'little'))
            elif instr.uses_name:
                # Resolve the name index.
                bc.extend(names.index(instr.arg).to_bytes(argsize, 'little'))
            elif instr.uses_varname:
                # Resolve the local variable index.
                bc.extend(
                    varnames.index(instr.arg).to_bytes(argsize, 'little'),
                )
            elif instr.uses_free:
                # uses_free is really "uses freevars **or** cellvars".
                try:
                    # look for the name in cellvars
                    bc.extend(
                        cellvars.index(instr.arg).to_bytes(argsize, 'little'),
                    )
                except ValueError:
                    # fall back to freevars, incrementing the length of
                    # cellvars.
                    bc.extend(
                        (freevars.index(instr.arg) + len(cellvars)).to_bytes(
                            argsize,
                            'little',
                        )
                    )
            elif instr.absjmp:
                # Resolve the absolute jump target.
                bc.extend(
                    self.bytecode_offset(instr.arg).to_bytes(
                        argsize,
                        'little',
                    ),
                )
            elif instr.reljmp:
                # Resolve the relative jump target.
                # We do this by subtracting the curren't instructions's
                # sparse index from the sparse index of the argument.
                # We then subtract argsize - 1 to account for the bytes the
                # current instruction takes up.
                bytecode_offset = self.bytecode_offset
                bc.extend((
                    bytecode_offset(instr.arg) -
                    bytecode_offset(instr) -
                    argsize -
                    1
                ).to_bytes(argsize, 'little',))
            elif instr.have_arg:
                # Write any other arg here.
                bc.extend(instr.arg.to_bytes(argsize, 'little'))
            elif WORDCODE:
                # with wordcode, all instructions are padded to 2 bytes
                bc.append(0)

        return CodeType(
            self.argcount,
            self.kwonlyargcount,
            len(varnames),
            self.stacksize,
            self.py_flags,
            bytes(bc),
            consts,
            names,
            varnames,
            self.filename,
            self.name,
            self.firstlineno,
            self.py_lnotab,
            freevars,
            cellvars,
        )

    @property
    def instrs(self):
        """The instructions in this code object.
        """
        return self._instrs

    @property
    def sparse_instrs(self):
        """The instructions where the index of an instruction
        is the bytecode offset of that instruction.

        None indicates that no instruction is at that offset.
        """
        return tuple(_sparse_args(self.instrs))

    @property
    def argcount(self):
        """The number of arguments this code object accepts.

        This does not include varargs (\*args).
        """
        return self._argcount

    @property
    def kwonlyargcount(self):
        """The number of keyword only arguments this code object accepts.

        This does not include varkwargs (\*\*kwargs).
        """
        return self._kwonlyargcount

    @property
    def consts(self):
        """The constants referenced in this code object.
        """
        # We cannot use a set comprehension because consts do not need
        # to be hashable.
        consts = []
        append_const = consts.append
        for instr in self.instrs:
            if isinstance(instr, LOAD_CONST) and instr.arg not in consts:
                append_const(instr.arg)
        return tuple(consts)

    @property
    def names(self):
        """The names referenced in this code object.

        Names come from instructions like LOAD_GLOBAL or STORE_ATTR
        where the name of the global or attribute is needed at runtime.
        """
        # We must sort to preserve the order between calls.
        # The set comprehension is to drop the duplicates.
        return tuple(sorted({
            instr.arg for instr in self.instrs if instr.uses_name
        }))

    @property
    def argnames(self):
        """The names of the arguments to this code object.

        The format is: [args] [vararg] [kwonlyargs] [varkwarg]
        where each group is optional.
        """
        return self._argnames

    @property
    def varnames(self):
        """The names of all of the local variables in this code object.
        """
        # We must sort to preserve the order between calls.
        # The set comprehension is to drop the duplicates.
        return self._argnames + tuple(sorted({
            instr.arg
            for instr in self.instrs
            if instr.uses_varname and instr.arg not in self._argnames
        }))

    @property
    def cellvars(self):
        """The names of the variables closed over by inner code objects.
        """
        return self._cellvars

    @property
    def freevars(self):
        """The names of the variables this code object has closed over.
        """
        return self._freevars

    @property
    def flags(self):
        """The flags of this code object represented as a mapping from flag
        name to boolean status.

        Notes
        -----
        This is a copy of the underlying flags. Mutations will not affect
        the code object.
        """
        return Flag.unpack(self._flags)

    @property
    def py_flags(self):
        """The flags of this code object represented as a bitmask.
        """
        return self._flags

    @property
    def is_nested(self):
        """Is this a nested code object?
        """
        return bool(self._flags & Flag.CO_NESTED)

    @property
    def is_generator(self):
        """Is this a generator?
        """
        return bool(self._flags & Flag.CO_GENERATOR)

    @property
    def is_coroutine(self):
        """Is this a coroutine defined with async def?

        This is 3.5 and greater.
        """
        return bool(self._flags & Flag.CO_COROUTINE)

    @property
    def is_iterable_coroutine(self):
        """Is this an async generator defined with types.coroutine?

        This is 3.5 and greater.
        """
        return bool(self._flags & Flag.CO_ITERABLE_COROUTINE)

    @property
    def constructs_new_locals(self):
        """Does this code object construct new locals?

        This is True for things like functions where executing the code
        needs a new locals dict each time; however, something like a module
        does not normally need new locals.
        """
        return bool(self._flags & Flag.CO_NEWLOCALS)

    @property
    def filename(self):
        """The filename of this code object.
        """
        return self._filename

    @property
    def name(self):
        """The name of this code object.
        """
        return self._name

    @property
    def firstlineno(self):
        """The first source line from self.filename
        that this code object represents.
        """
        return self._firstlineno

    @property
    def lnotab(self):
        """The mapping of line number to the first instruction on that line.
        """
        return self._lnotab

    @lazyval
    def lno_of_instr(self):
        instrs = self.instrs
        lnos = [None] * len(instrs)
        reverse_lnotab = reverse_dict(self.lnotab)
        for n, instr in enumerate(instrs):
            lnos[n] = reverse_lnotab.get(instr)
        return dict(zip(instrs, ffill(lnos)))

    @property
    def py_lnotab(self):
        """The encoded lnotab that python uses to compute when lines start.

        Note
        ----
        See Objects/lnotab_notes.txt in the cpython source for more details.
        """
        reverse_lnotab = reverse_dict(self.lnotab)
        py_lnotab = []
        prev_instr = 0
        prev_lno = self.firstlineno
        for addr, instr in enumerate(_sparse_args(self.instrs)):
            lno = reverse_lnotab.get(instr)
            if lno is None:
                continue

            delta = lno - prev_lno
            py_lnotab.append(addr - prev_instr)
            py_lnotab.append(min(delta, max_lnotab_increment))
            delta -= max_lnotab_increment
            while delta > 0:
                py_lnotab.append(0)
                py_lnotab.append(min(delta, max_lnotab_increment))
                delta -= max_lnotab_increment

            prev_lno = lno
            prev_instr = addr

        return bytes(py_lnotab)

    @property
    def stacksize(self):
        """The maximum amount of stack space used by this code object.
        """
        return max(scanl(
            op.add,
            0,
            map(op.attrgetter('stack_effect'), self.instrs),
        ))

    def index(self, instr):
        """Returns the index of instr.

        Parameters
        ----------
        instr : Instruction
            The instruction the check the index of.

        Returns
        -------
        idx : int
            The index of instr in this code object.
        """
        return self.instrs.index(instr)

    def bytecode_offset(self, instr):
        """Returns the offset of instr in the bytecode representation.

        Parameters
        ----------
        instr : Instruction
            The instruction the check the index of.

        Returns
        -------
        idx : int
            The index of instr in this code object in the sparse instructions.
        """
        return self.sparse_instrs.index(instr)

    def __getitem__(self, key):
        return self.instrs[key]

    def __iter__(self):
        return iter(self.instrs)

    def __len__(self):
        return len(self.instrs)

    def __contains__(self, instr):
        return instr in self.instrs

    def dis(self, file=None):
        """
        Print self via the stdlib ``dis`` module.

        Parameters
        ----------
        file : file-like, optional
            A file-like object into which we should print.
            Defaults to sys.stdout.
        """
        dis(self.to_pycode(), file=file)


================================================
FILE: codetransformer/core.py
================================================
from collections import OrderedDict
from contextlib import contextmanager
from ctypes import py_object, pythonapi
from itertools import chain
from types import CodeType, FunctionType
from weakref import WeakKeyDictionary

try:
    import threading
except ImportError:
    import dummy_threading as threading

from .code import Code
from .instructions import LOAD_CONST, STORE_FAST, LOAD_FAST
from .patterns import (
    boundpattern,
    patterndispatcher,
    DEFAULT_STARTCODE,
)
from .utils.instance import instance


_cell_new = pythonapi.PyCell_New
_cell_new.argtypes = (py_object,)
_cell_new.restype = py_object


def _a_if_not_none(a, b):
    return a if a is not None else b


def _new_lnotab(instrs, lnotab):
    """The updated lnotab after the instructions have been transformed.

    Parameters
    ----------
    instrs : iterable[Instruction]
        The new instructions.
    lnotab : dict[Instruction -> int]
        The lnotab for the old code object.

    Returns
    -------
    new_lnotab : dict[Instruction -> int]
        The post transform lnotab.
    """
    return {
        lno: _a_if_not_none(instr._stolen_by, instr)
        for lno, instr in lnotab.items()
    }


class NoContext(Exception):
    """Exception raised to indicate that the ``code` or ``startcode``
    attribute was accessed outside of a code context.
    """
    def __init__(self):
        return super().__init__('no active transformation context')


class Context:
    """Empty object for holding the transformation context.
    """
    def __init__(self, code):
        self.code = code
        self.startcode = DEFAULT_STARTCODE

    def __repr__(self):  # pragma: no cover
        return '<%s: %r>' % (type(self).__name__, self.__dict__)


class CodeTransformerMeta(type):
    """Meta class for CodeTransformer to collect all of the patterns
    and ensure the class dict is ordered.

    Patterns are created when a method is decorated with
    ``codetransformer.pattern.pattern``
    """
    def __new__(mcls, name, bases, dict_):
        dict_['patterndispatcher'] = patterndispatcher(*chain(
            (v for v in dict_.values() if isinstance(v, boundpattern)),
            *(
                d and d.patterns for d in (
                    getattr(b, 'patterndispatcher', ()) for b in bases
                )
            )
        ))
        return super().__new__(mcls, name, bases, dict_)

    def __prepare__(self, bases):
        return OrderedDict()


class CodeTransformer(metaclass=CodeTransformerMeta):
    """A code object transformer, similar to the NodeTransformer
    from the ast module.

    Attributes
    ----------
    code
    """
    __slots__ = '__weakref__',

    def transform_consts(self, consts):
        """transformer for the co_consts field.

        Override this method to transform the `co_consts` of the code object.

        Parameters
        ----------
        consts : tuple
            The co_consts

        Returns
        -------
        new_consts : tuple
            The new constants.
        """
        return tuple(
            self.transform(Code.from_pycode(const)).to_pycode()
            if isinstance(const, CodeType) else
            const
            for const in consts
        )

    def _id(self, obj):
        """Identity function.

        Parameters
        ----------
        obj : any
            The object to return

        Returns
        -------
        obj : any
            The input unchanged
        """
        return obj

    transform_name = _id
    transform_names = _id
    transform_varnames = _id
    transform_freevars = _id
    transform_cellvars = _id
    transform_defaults = _id

    del _id

    def transform(self, code, *, name=None, filename=None):
        """Transform a codetransformer.Code object applying the transforms.

        Parameters
        ----------
        code : Code
            The code object to transform.
        name : str, optional
            The new name for this code object.
        filename : str, optional
            The new filename for this code object.

        Returns
        -------
        new_code : Code
            The transformed code object.
        """
        # reverse lookups from for constants and names.
        reversed_consts = {}
        reversed_names = {}
        reversed_varnames = {}
        for instr in code:
            if isinstance(instr, LOAD_CONST):
                reversed_consts[instr] = instr.arg
            if instr.uses_name:
                reversed_names[instr] = instr.arg
            if isinstance(instr, (STORE_FAST, LOAD_FAST)):
                reversed_varnames[instr] = instr.arg

        instrs, consts = tuple(zip(*reversed_consts.items())) or ((), ())
        for instr, const in zip(instrs, self.transform_consts(consts)):
            instr.arg = const

        instrs, names = tuple(zip(*reversed_names.items())) or ((), ())
        for instr, name_ in zip(instrs, self.transform_names(names)):
            instr.arg = name_

        instrs, varnames = tuple(zip(*reversed_varnames.items())) or ((), ())
        for instr, varname in zip(instrs, self.transform_varnames(varnames)):
            instr.arg = varname

        with self._new_context(code):
            post_transform = self.patterndispatcher(code)

            return Code(
                post_transform,
                code.argnames,
                cellvars=self.transform_cellvars(code.cellvars),
                freevars=self.transform_freevars(code.freevars),
                name=name if name is not None else code.name,
                filename=filename if filename is not None else code.filename,
                firstlineno=code.firstlineno,
                lnotab=_new_lnotab(post_transform, code.lnotab),
                flags=code.flags,
            )

    def __call__(self, f, *,
                 globals_=None, name=None, defaults=None, closure=None):
        # Callable so that we can use CodeTransformers as decorators.
        if closure is not None:
            closure = tuple(map(_cell_new, closure))
        else:
            closure = f.__closure__

        return FunctionType(
            self.transform(Code.from_pycode(f.__code__)).to_pycode(),
            _a_if_not_none(globals_, f.__globals__),
            _a_if_not_none(name, f.__name__),
            _a_if_not_none(defaults, f.__defaults__),
            closure,
        )

    @instance
    class _context_stack(threading.local):
        """Thread safe transformation context stack.

        Each thread will get it's own ``WeakKeyDictionary`` that maps
        instances to a stack of ``Context`` objects. When this descriptor
        is looked up we first try to get the weakkeydict off of the thread
        local storage. If it doesn't exist we make a new map. Then we lookup
        our instance in this map. If it doesn't exist yet create a new stack
        (as an empty list).

        This allows a single instance of ``CodeTransformer`` to be used
        recursively to transform code objects in a thread safe way while
        still being able to use a stateful context.
        """
        def __get__(self, instance, owner):
            try:
                stacks = self._context_stacks
            except AttributeError:
                stacks = self._context_stacks = WeakKeyDictionary()

            if instance is None:
                # when looked up off the class return the current threads
                # context stacks map
                return stacks

            return stacks.setdefault(instance, [])

    @contextmanager
    def _new_context(self, code):
        self._context_stack.append(Context(code))
        try:
            yield
        finally:
            self._context_stack.pop()

    @property
    def context(self):
        """Lookup the current transformation context.

        Raises
        ------
        NoContext
            Raised when there is no active transformation context.
        """
        try:
            return self._context_stack[-1]
        except IndexError:
            raise NoContext()

    @property
    def code(self):
        """The code object we are currently manipulating.
        """
        return self.context.code

    @property
    def startcode(self):
        """The startcode we are currently in.
        """
        return self.context.startcode

    def begin(self, startcode):
        """Begin a new startcode.

        Parameters
        ----------
        startcode : any
            The startcode to begin.
        """
        self.context.startcode = startcode


================================================
FILE: codetransformer/decompiler/_343.py
================================================
import ast
from collections import deque
from functools import singledispatch
from itertools import takewhile
import types

from toolz import complement, compose, curry, sliding_window
import toolz.curried.operator as op

from . import paramnames
from ..code import Code
from .. import instructions as instrs
from ..utils.functional import not_a, is_a
from ..utils.immutable import immutable
from codetransformer import a as showa, d as showd  # noqa


__all__ = [
    'DecompilationContext',
    'DecompilationError',
    'decompile',
    'pycode_to_body',
]


class DecompilationError(Exception):
    pass


class DecompilationContext(immutable,
                           defaults={
                               "in_function_block": False,
                               "in_lambda": False,
                               "make_function_context": None,
                               "top_of_loop": None}):

    """
    Value representing the context of the current decompilation run.
    """
    __slots__ = (
        'in_function_block',
        'in_lambda',
        'make_function_context',
        'top_of_loop',
    )


class MakeFunctionContext(immutable):
    __slots__ = ('closure',)


def decompile(f):
    """
    Decompile a function.

    Parameters
    ----------
    f : function
        The function to decompile.

    Returns
    -------
    ast : ast.FunctionDef
        A FunctionDef node that compiles to f.
    """
    co = f.__code__
    args, kwonly, varargs, varkwargs = paramnames(co)
    annotations = f.__annotations__ or {}
    defaults = list(f.__defaults__ or ())
    kw_defaults = f.__kwdefaults__ or {}

    if f.__name__ == '<lambda>':
        node = ast.Lambda
        body = pycode_to_body(co, DecompilationContext(in_lambda=True))[0]
        extra_kwargs = {}
    else:
        node = ast.FunctionDef
        body = pycode_to_body(co, DecompilationContext(in_function_block=True))
        extra_kwargs = {
            'decorator_list': [],
            'returns': annotations.get('return')
        }

    return node(
        name=f.__name__,
        args=make_function_arguments(
            args=args,
            kwonly=kwonly,
            varargs=varargs,
            varkwargs=varkwargs,
            defaults=defaults,
            kw_defaults=kw_defaults,
            annotations=annotations,
        ),
        body=body,
        **extra_kwargs
    )


def pycode_to_body(co, context):
    """
    Convert a Python code object to a list of AST body elements.
    """
    code = Code.from_pycode(co)

    # On each instruction, temporarily store all the jumps to the **next**
    # instruction.  This is used in _make_expr to determine when an expression
    # is part of a short-circuiting expression.
    for a, b in sliding_window(2, code.instrs):
        a._next_target_of = b._target_of
    b._next_target_of = set()

    try:
        body = instrs_to_body(deque(code.instrs), context)
        if context.in_function_block:
            return make_global_and_nonlocal_decls(code.instrs) + body
        return body
    finally:
        # Clean up jump target data.
        for i in code.instrs:
            del i._next_target_of


def instrs_to_body(instrs, context):
    """
    Convert a list of Instruction objects to a list of AST body nodes.
    """
    stack = []
    body = []
    process_instrs(instrs, stack, body, context)

    if stack:
        raise DecompilationError(
            "Non-empty stack at the end of instrs_to_body(): %s." % stack
        )
    return body


def process_instrs(queue, stack, body, context):
    """
    Process instructions from the instruction queue.
    """
    next_instr = queue.popleft
    while queue:
        newcontext = _process_instr(next_instr(), queue, stack, body, context)
        if newcontext is not None:
            context = newcontext


@singledispatch
def _process_instr(instr, queue, stack, body, context):
    raise AssertionError(
        "process_instr() passed a non-instruction argument %s" % type(instr)
    )


@_process_instr.register(instrs.Instruction)
def _instr(instr, queue, stack, body, context):
    raise DecompilationError(
        "Don't know how to decompile instructions of type %s" % type(instr)
    )


@_process_instr.register(instrs.POP_JUMP_IF_TRUE)
@_process_instr.register(instrs.POP_JUMP_IF_FALSE)
def _process_jump(instr, queue, stack, body, context):
    stack_effect_until_target = sum(
        map(
            op.attrgetter('stack_effect'),
            takewhile(op.is_not(instr.arg), queue)
        )
    )
    if stack_effect_until_target == 0:
        body.append(make_if_statement(instr, queue, stack, context))
        return
    else:
        raise DecompilationError(
            "Don't know how to decompile `and`/`or`/`ternary` exprs."
        )


def make_if_statement(instr, queue, stack, context):
    """
    Make an ast.If block from a POP_JUMP_IF_TRUE or POP_JUMP_IF_FALSE.
    """
    test_expr = make_expr(stack)
    if isinstance(instr, instrs.POP_JUMP_IF_TRUE):
        test_expr = ast.UnaryOp(op=ast.Not(), operand=test_expr)

    first_block = popwhile(op.is_not(instr.arg), queue, side='left')
    if isinstance(first_block[-1], instrs.RETURN_VALUE):
        body = instrs_to_body(first_block, context)
        return ast.If(test=test_expr, body=body, orelse=[])

    jump_to_end = expect(
        first_block.pop(), instrs.JUMP_FORWARD, "at end of if-block"
    )

    body = instrs_to_body(first_block, context)

    # First instruction after the whole if-block.
    end = jump_to_end.arg
    if instr.arg is jump_to_end.arg:
        orelse = []
    else:
        orelse = instrs_to_body(
            popwhile(op.is_not(end), queue, side='left'),
            context,
        )

    return ast.If(test=test_expr, body=body, orelse=orelse)


@_process_instr.register(instrs.EXTENDED_ARG)
def _process_instr_extended_arg(instr, queue, stack, body, context):
    """We account for EXTENDED_ARG when constructing Code objects."""
    pass


@_process_instr.register(instrs.UNPACK_SEQUENCE)
def _process_instr_unpack_sequence(instr, queue, stack, body, context):
    body.append(make_assignment(instr, queue, stack))


@_process_instr.register(instrs.IMPORT_NAME)
def _process_instr_import_name(instr, queue, stack, body, context):
    """
    Process an IMPORT_NAME instruction.

    Side Effects
    ------------
    Pops two instuctions from `stack`
    Consumes instructions from `queue` to the end of the import statement.
    Appends an ast.Import or ast.ImportFrom node to `body`.
    """
    # If this is "import module", fromlist is None.
    # If this this is "from module import a, b fromlist will be ('a', 'b').
    fromlist = stack.pop().arg

    # level argument to __import__.  Should be 0, 1, or 2.
    level = stack.pop().arg

    module = instr.arg
    if fromlist is None:  # Regular import.
        attr_loads = _pop_import_LOAD_ATTRs(module, queue)
        store = queue.popleft()
        # There are two cases where we should emit an alias:
        # import a as <anything but a>
        # import a.b.c as <anything (including a)>
        if attr_loads or module.split('.')[0] != store.arg:
            asname = store.arg
        else:
            asname = None
        body.append(
            ast.Import(
                names=[
                    ast.alias(
                        name=module,
                        asname=(asname),
                    ),
                ],
                level=level,
            ),
        )
        return
    elif fromlist == ('*',):  # From module import *.
        expect(queue.popleft(), instrs.IMPORT_STAR, "after IMPORT_NAME")
        body.append(
            ast.ImportFrom(
                module=module,
                names=[ast.alias(name='*', asname=None)],
                level=level,
            ),
        )
        return

    # Consume a pair of IMPORT_FROM, STORE_NAME instructions for each entry in
    # fromlist.
    names = list(map(make_importfrom_alias(queue, body, context), fromlist))
    body.append(ast.ImportFrom(module=module, names=names, level=level))

    # Remove the final POP_TOP of the imported module.
    expect(queue.popleft(), instrs.POP_TOP, "after 'from import'")


def _pop_import_LOAD_ATTRs(module_name, queue):
    """
    Pop LOAD_ATTR instructions for an import of the form::

        import a.b.c as d

    which should generate bytecode like this::

        1           0 LOAD_CONST               0 (0)
                    3 LOAD_CONST               1 (None)
                    6 IMPORT_NAME              0 (a.b.c.d)
                    9 LOAD_ATTR                1 (b)
                   12 LOAD_ATTR                2 (c)
                   15 LOAD_ATTR                3 (d)
                   18 STORE_NAME               3 (d)
    """
    popped = popwhile(is_a(instrs.LOAD_ATTR), queue, side='left')
    if popped:
        expected = module_name.split('.', maxsplit=1)[1]
        actual = '.'.join(map(op.attrgetter('arg'), popped))
        if expected != actual:
            raise DecompilationError(
                "Decompiling import of module %s, but LOAD_ATTRS imply %s" % (
                    expected, actual,
                )
            )
    return popped


@curry
def make_importfrom_alias(queue, body, context, name):
    """
    Make an ast.alias node for the names list of an ast.ImportFrom.

    Parameters
    ----------
    queue : deque
        Instruction Queue
    body : list
        Current body.
    context : DecompilationContext
    name : str
        Expected name of the IMPORT_FROM node to be popped.

    Returns
    -------
    alias : ast.alias

    Side Effects
    ------------
    Consumes IMPORT_FROM and STORE_NAME instructions from queue.
    """
    import_from, store = queue.popleft(), queue.popleft()
    expect(import_from, instrs.IMPORT_FROM, "after IMPORT_NAME")

    if not import_from.arg == name:
        raise DecompilationError(
            "IMPORT_FROM name mismatch. Expected %r, but got %s." % (
                name, import_from,
            )
        )
    return ast.alias(
        name=name,
        asname=store.arg if store.arg != name else None,
    )


@_process_instr.register(instrs.COMPARE_OP)
@_process_instr.register(instrs.UNARY_NOT)
@_process_instr.register(instrs.BINARY_SUBSCR)
@_process_instr.register(instrs.LOAD_ATTR)
@_process_instr.register(instrs.LOAD_GLOBAL)
@_process_instr.register(instrs.LOAD_CONST)
@_process_instr.register(instrs.LOAD_FAST)
@_process_instr.register(instrs.LOAD_NAME)
@_process_instr.register(instrs.LOAD_DEREF)
@_process_instr.register(instrs.LOAD_CLOSURE)
@_process_instr.register(instrs.BUILD_TUPLE)
@_process_instr.register(instrs.BUILD_SET)
@_process_instr.register(instrs.BUILD_LIST)
@_process_instr.register(instrs.BUILD_MAP)
@_process_instr.register(instrs.STORE_MAP)
@_process_instr.register(instrs.CALL_FUNCTION)
@_process_instr.register(instrs.CALL_FUNCTION_VAR)
@_process_instr.register(instrs.CALL_FUNCTION_KW)
@_process_instr.register(instrs.CALL_FUNCTION_VAR_KW)
@_process_instr.register(instrs.BUILD_SLICE)
@_process_instr.register(instrs.JUMP_IF_TRUE_OR_POP)
@_process_instr.register(instrs.JUMP_IF_FALSE_OR_POP)
def _push(instr, queue, stack, body, context):
    """
    Just push these instructions onto the stack for further processing
    downstream.
    """
    stack.append(instr)


@_process_instr.register(instrs.MAKE_FUNCTION)
@_process_instr.register(instrs.MAKE_CLOSURE)
def _make_function(instr, queue, stack, body, context):
    """
    Set a make_function_context, then push onto the stack.
    """
    assert stack, "Empty stack before MAKE_FUNCTION."
    prev = stack[-1]
    expect(prev, instrs.LOAD_CONST, "before MAKE_FUNCTION")

    stack.append(instr)

    if is_lambda_name(prev.arg):
        return

    return context.update(
        make_function_context=MakeFunctionContext(
            closure=isinstance(instr, instrs.MAKE_CLOSURE),
        )
    )


@_process_instr.register(instrs.STORE_FAST)
@_process_instr.register(instrs.STORE_NAME)
@_process_instr.register(instrs.STORE_DEREF)
@_process_instr.register(instrs.STORE_GLOBAL)
def _store(instr, queue, stack, body, context):
    # This is set by MAKE_FUNCTION nodes to register that the next `STORE_NAME`
    # should create a FunctionDef node.
    if context.make_function_context is not None:
        body.append(
            make_function(
                pop_arguments(instr, stack),
                **context.make_function_context.to_dict()
            ),
        )
        return context.update(make_function_context=None)

    body.append(make_assignment(instr, queue, stack))


@_process_instr.register(instrs.DUP_TOP)
def _dup_top(instr, queue, stack, body, context):
    body.append(make_assignment(instr, queue, stack))


def make_assignment(instr, queue, stack):
    """
    Make an ast.Assign node.
    """
    value = make_expr(stack)

    # Make assignment targets.
    # If there are multiple assignments (e.g. 'a = b = c'),
    # each LHS expression except the last is preceded by a DUP_TOP instruction.
    # Thus, we make targets until we don't see a DUP_TOP, and then make one
    # more.
    targets = []
    while isinstance(instr, instrs.DUP_TOP):
        targets.append(make_assign_target(queue.popleft(), queue, stack))
        instr = queue.popleft()

    targets.append(make_assign_target(instr, queue, stack))

    return ast.Assign(targets=targets, value=value)


@singledispatch
def make_assign_target(instr, queue, stack):
    """
    Make an AST node for the LHS of an assignment beginning at `instr`.
    """
    raise DecompilationError("Can't make assignment target for %s." % instr)


@make_assign_target.register(instrs.STORE_FAST)
@make_assign_target.register(instrs.STORE_NAME)
@make_assign_target.register(instrs.STORE_DEREF)
@make_assign_target.register(instrs.STORE_GLOBAL)
def make_assign_target_store(instr, queue, stack):
    return ast.Name(id=instr.arg, ctx=ast.Store())


@make_assign_target.register(instrs.STORE_ATTR)
def make_assign_target_setattr(instr, queue, stack):
    return ast.Attribute(
        value=make_expr(stack),
        attr=instr.arg,
        ctx=ast.Store(),
    )


@make_assign_target.register(instrs.STORE_SUBSCR)
def make_assign_target_setitem(instr, queue, stack):
    slice_ = make_slice(stack)
    collection = make_expr(stack)
    return ast.Subscript(
        value=collection,
        slice=slice_,
        ctx=ast.Store(),
    )


@make_assign_target.register(instrs.UNPACK_SEQUENCE)
def make_assign_target_unpack(instr, queue, stack):
    return ast.Tuple(
        elts=[
            make_assign_target(queue.popleft(), queue, stack)
            for _ in range(instr.arg)
        ],
        ctx=ast.Store(),
    )


@make_assign_target.register(instrs.LOAD_NAME)
@make_assign_target.register(instrs.LOAD_ATTR)
@make_assign_target.register(instrs.BINARY_SUBSCR)
def make_assign_target_load_name(instr, queue, stack):
    # We hit this case when a setattr or setitem is nested in a more complex
    # assignment.  Just push the load onto the stack to be processed by the
    # upcoming STORE_ATTR or STORE_SUBSCR.
    stack.append(instr)
    return make_assign_target(queue.popleft(), queue, stack)


@_process_instr.register(instrs.STORE_ATTR)
@_process_instr.register(instrs.STORE_SUBSCR)
def _store_subscr(instr, queue, stack, body, context):
    target = make_assign_target(instr, queue, stack)
    rhs = make_expr(stack)
    body.append(ast.Assign(targets=[target], value=rhs))


@_process_instr.register(instrs.POP_TOP)
def _pop(instr, queue, stack, body, context):
    body.append(ast.Expr(value=make_expr(stack)))


@_process_instr.register(instrs.RETURN_VALUE)
def _return(instr, queue, stack, body, context):
    if context.in_function_block:
        body.append(ast.Return(value=make_expr(stack)))
    elif context.in_lambda:
        if body:
            raise DecompilationError("Non-empty body in lambda: %s" % body)
        # Just append the raw expr.  We'll extract the raw value in
        # `make_lambda`.
        body.append(make_expr(stack))
    else:
        _check_stack_for_module_return(stack)
        # Pop dummy LOAD_CONST(None) at the end of a module.
        stack.pop()
        return


@_process_instr.register(instrs.BREAK_LOOP)
def _jump_break_loop(instr, queue, stack, body, context):
    if context.top_of_loop is None:
        raise DecompilationError("BREAK_LOOP outside of loop.")
    body.append(ast.Break())


@_process_instr.register(instrs.JUMP_ABSOLUTE)
def _jump_absolute(instr, queue, stack, body, context):
    if instr.arg is context.top_of_loop:
        body.append(ast.Continue())
        return
    raise DecompilationError("Don't know how to decompile %s." % instr)


@_process_instr.register(instrs.SETUP_WITH)
def _process_instr_setup_with(instr, queue, stack, body, context):
    items = [make_withitem(queue, stack)]
    block_body = instrs_to_body(
        pop_with_body_instrs(instr, queue),
        context,
    )

    # Handle compound with statement (e.g. "with a, b").
    if len(block_body) == 1 and isinstance(block_body[0], ast.With):
        nested_with = block_body[0]
        # Merge the inner block's items with our top-level items.
        items += nested_with.items
        # Use the inner block's body as the real body.
        block_body = nested_with.body

    return body.append(
        ast.With(items=items, body=block_body)
    )


def pop_with_body_instrs(setup_with_instr, queue):
    """
    Pop instructions from `queue` that form the body of a with block.
    """
    body_instrs = popwhile(op.is_not(setup_with_instr.arg), queue, side='left')

    # Last two instructions should always be POP_BLOCK, LOAD_CONST(None).
    # These don't correspond to anything in the AST, so remove them here.
    load_none = body_instrs.pop()
    expect(load_none, instrs.LOAD_CONST, "at end of with-block")
    pop_block = body_instrs.pop()
    expect(pop_block, instrs.POP_BLOCK, "at end of with-block")
    if load_none.arg is not None:
        raise DecompilationError(
            "Expected LOAD_CONST(None), but got "
            "%r instead" % (load_none)
        )

    # Target of the setup_with should be a WITH_CLEANUP instruction followed by
    # an END_FINALLY.  Neither of these correspond to anything in the AST.
    with_cleanup = queue.popleft()
    expect(with_cleanup, instrs.WITH_CLEANUP, "at end of with-block")
    end_finally = queue.popleft()
    expect(end_finally, instrs.END_FINALLY, "at end of with-block")

    return body_instrs


def make_withitem(queue, stack):
    """
    Make an ast.withitem node.
    """
    context_expr = make_expr(stack)
    # This is a POP_TOP for just "with <expr>:".
    # This is a STORE_NAME(name) for "with <expr> as <name>:".
    as_instr = queue.popleft()
    if isinstance(as_instr, (instrs.STORE_FAST,
                             instrs.STORE_NAME,
                             instrs.STORE_DEREF,
                             instrs.STORE_GLOBAL)):
        return ast.withitem(
            context_expr=context_expr,
            optional_vars=make_assign_target(as_instr, queue, stack),
        )
    elif isinstance(as_instr, instrs.POP_TOP):
        return ast.withitem(context_expr=context_expr, optional_vars=None)
    else:
        raise DecompilationError(
            "Don't know how to make withitem from %s" % as_instr,
        )


@_process_instr.register(instrs.SETUP_LOOP)
def _loop(instr, queue, stack, body, context):
    loop_type, loop_body, else_body = pop_loop_instrs(instr, queue)
    assert loop_type in ('for', 'while'), "Unknown loop type %r" % loop_type
    if loop_type == 'for':
        body.append(make_for_loop(loop_body, else_body, context))
    elif loop_type == 'while':
        body.append(make_while_loop(loop_body, else_body, context))


def make_for_loop(loop_body_instrs, else_body_instrs, context):
    """
    Make an ast.For node.
    """
    # Instructions from start until GET_ITER are the builders for the iterator
    # expression.
    iterator_expr = make_expr(
        popwhile(not_a(instrs.GET_ITER), loop_body_instrs, side='left')
    )

    # Next is the GET_ITER instruction, which we don't need.
    loop_body_instrs.popleft()

    # Next is FOR_ITER, which is the jump target for Continue nodes.
    top_of_loop = loop_body_instrs.popleft()

    # This can be a STORE_* or an UNPACK_SEQUENCE followed by some number of
    # stores.
    target = make_assign_target(
        loop_body_instrs.popleft(),
        loop_body_instrs,
        stack=[],
    )

    body, orelse_body = make_loop_body_and_orelse(
        top_of_loop, loop_body_instrs, else_body_instrs, context
    )

    return ast.For(
        target=target,
        iter=iterator_expr,
        body=body,
        orelse=orelse_body,
    )


def make_loop_body_and_orelse(top_of_loop, body_instrs, else_instrs, context):
    """
    Make body and orelse lists for a for/while loop whose first instruction is
    `top_of_loop`.

    Parameters
    ----------
    top_of_loop : Instruction
        The first body of the loop.  For a for-loop, this should always be a
        FOR_ITER.  For a while loop, it's the first instruction of the stack
        builders for the loop test expression
    body_instrs : deque
        Queue of Instructions that form the body of the loop.  The last two
        elements of body_instrs should be a JUMP_ABSOLUTE to `top_of_loop` and
        a POP_BLOCK.
    else_instrs : deque
        Queue of Instructions that form the else block of the loop.  Should be
        an empty deque if there is no else block.
    context : DecompilationContext

    Returns
    -------
    body : list[ast.AST]
        List of ast nodes forming the loop body.
    orelse_body : list[ast.AST]
        List of ast nodes forming the else-block body.
    """
    # Remove the JUMP_ABSOLUTE and POP_BLOCK instructions at the bottom of the
    # loop.
    body_instrs.pop()
    body_instrs.pop()
    body = instrs_to_body(body_instrs, context.update(top_of_loop=top_of_loop))

    if else_instrs:
        else_body = instrs_to_body(else_instrs, context)
    else:
        else_body = []

    return body, else_body


def make_while_loop(test_and_body_instrs, else_body_instrs, context):
    """
    Make an ast.While node.

    Parameters
    ----------
    test_and_body_instrs : deque
        Queue of instructions forming the loop test expression and body.
    else_body_instrs : deque
        Queue of instructions forming the else block of the loop.
    context : DecompilationContext
    """
    top_of_loop = test_and_body_instrs[0]

    # The popped elements are the stack_builders for the loop test expression.
    # The top of the loop_body_instrs is either a POP_JUMP_IF_TRUE or a
    # POP_JUMP_IF_FALSE.
    test, body_instrs = make_while_loop_test_expr(test_and_body_instrs)
    body, orelse_body = make_loop_body_and_orelse(
        top_of_loop, body_instrs, else_body_instrs, context,
    )

    # while-else blocks are not yet supported or handled.
    return ast.While(test=test, body=body, orelse=orelse_body)


def make_while_loop_test_expr(loop_body_instrs):
    """
    Make an expression in the context of a while-loop test.

    Code of the form::

        while <expr>:
            <body>

    generates a POP_JUMP_IF_FALSE for the loop test, while code of the form::

        while not <expr>:
            <body>

    generates a POP_JUMP_IF_TRUE for the loop test.

    Code of the form::

        while True:
            <body>

    generates no jumps at all.
    """
    bottom_of_loop = loop_body_instrs[-1]
    is_jump_to_bottom = compose(op.is_(bottom_of_loop), op.attrgetter('arg'))

    # Consume instructions until we find a jump to the bottom of the loop.
    test_builders = deque(
        popwhile(complement(is_jump_to_bottom), loop_body_instrs, side='left')
    )
    # If we consumed the entire loop body without finding a jump, assume this
    # is a while True loop.  Return the rest of the instructions as the loop
    # body.
    if not loop_body_instrs:
        return ast.NameConstant(value=True), test_builders

    # Top of the body is either a POP_JUMP_IF_TRUE or POP_JUMP_IF_FALSE.
    jump = loop_body_instrs.popleft()
    expr = make_expr(test_builders)
    if isinstance(jump, instrs.POP_JUMP_IF_TRUE):
        return ast.UnaryOp(op=ast.Not(), operand=expr), loop_body_instrs
    else:
        return expr, loop_body_instrs


def pop_loop_instrs(setup_loop_instr, queue):
    """
    Determine whether setup_loop_instr is setting up a for-loop or a
    while-loop.  Then pop the loop instructions from queue.

    The easiest way to tell the difference is to look at the target of the
    JUMP_ABSOLUTE instruction at the end of the loop.  If it jumps to a
    FOR_ITER, then this is a for-loop.  Otherwise it's a while-loop.

    The jump we want to inspect is the first JUMP_ABSOLUTE instruction prior to
    the jump target of `setup_loop_instr`.

    Parameters
    ----------
    setup_loop_instr : instructions.SETUP_LOOP
        First instruction of the loop being parsed.
    queue : collections.deque
        Queue of unprocessed instructions.

    Returns
    -------
    loop_type : str, {'for', 'while'}
        The kind of loop being constructed.
    loop_instrs : deque
        The instructions forming body of the loop.
    else_instrs : deque
        The instructions forming the else-block of the loop.

    Side Effects
    ------------
    Pops all returned instructions from `queue`.
    """
    # Grab everything from left side of the queue until the jump target of
    # SETUP_LOOP.
    body = popwhile(op.is_not(setup_loop_instr.arg), queue, side='left')

    # Anything after the last POP_BLOCK instruction is the else-block.
    else_body = popwhile(not_a(instrs.POP_BLOCK), body, side='right')

    jump_to_top, pop_block = body[-2], body[-1]
    if not isinstance(jump_to_top, instrs.JUMP_ABSOLUTE):
        raise DecompilationError(
            "Penultimate instruction of loop body is "
            "%s, not JUMP_ABSOLUTE." % jump_to_top,
        )

    if not isinstance(pop_block, instrs.POP_BLOCK):
        raise DecompilationError(
            "Last instruction of loop body is "
            "%s, not pop_block." % pop_block,
        )

    loop_expr = jump_to_top.arg
    if isinstance(loop_expr, instrs.FOR_ITER):
        return 'for', body, else_body
    return 'while', body, else_body


def make_expr(stack_builders):
    """
    Convert a sequence of instructions into AST expressions.
    """
    return _make_expr(stack_builders.pop(), stack_builders)


_BOOLOP_JUMP_TO_AST_OP = {
    instrs.JUMP_IF_TRUE_OR_POP: ast.Or,
    instrs.JUMP_IF_FALSE_OR_POP: ast.And,
}
_BOOLOP_JUMP_TYPES = tuple(_BOOLOP_JUMP_TO_AST_OP)


def _make_expr(toplevel, stack_builders):
    """
    Override the single-dispatched make_expr with wrapper logic for handling
    short-circuiting expressions.
    """
    base_expr = _make_expr_internal(toplevel, stack_builders)
    if not toplevel._next_target_of:
        return base_expr

    subexprs = deque([base_expr])
    ops = deque([])
    while stack_builders and stack_builders[-1] in toplevel._next_target_of:
        jump = stack_builders.pop()
        if not isinstance(jump, _BOOLOP_JUMP_TYPES):
            raise DecompilationError(
                "Don't know how to decompile %s inside expression." % jump,
            )
        subexprs.appendleft(make_expr(stack_builders))
        ops.appendleft(_BOOLOP_JUMP_TO_AST_OP[type(jump)]())

    if len(subexprs) <= 1:
        raise DecompilationError(
            "Expected at least one JUMP instruction before expression."
        )

    return normalize_boolop(make_boolop(subexprs, ops))


def make_boolop(exprs, op_types):
    """
    Parameters
    ----------
    exprs : deque
    op_types : deque[{ast.And, ast.Or}]
    """
    if len(op_types) > 1:
        return ast.BoolOp(
            op=op_types.popleft(),
            values=[exprs.popleft(), make_boolop(exprs, op_types)],
        )

    assert len(exprs) == 2
    return ast.BoolOp(op=op_types.popleft(), values=list(exprs))


def normalize_boolop(expr):
    """
    Normalize a boolop by folding together nested And/Or exprs.
    """
    optype = expr.op
    newvalues = []
    for subexpr in expr.values:
        if not isinstance(subexpr, ast.BoolOp):
            newvalues.append(subexpr)
        elif type(subexpr.op) != type(optype):
            newvalues.append(normalize_boolop(subexpr))
        else:
            # Normalize subexpression, then inline its values into the
            # top-level subexpr.
            newvalues.extend(normalize_boolop(subexpr).values)
    return ast.BoolOp(op=optype, values=newvalues)


@singledispatch
def _make_expr_internal(toplevel, stack_builders):
    raise DecompilationError(
        "Don't know how to build expression for %s" % toplevel
    )


@_make_expr_internal.register(instrs.MAKE_FUNCTION)
@_make_expr_internal.register(instrs.MAKE_CLOSURE)
def _make_lambda(toplevel, stack_builders):
    load_name = stack_builders.pop()
    load_code = stack_builders.pop()
    _check_make_function_instrs(
        load_code,
        load_name,
        toplevel,
        expect_lambda=True,
    )

    co = load_code.arg
    args, kwonly, varargs, varkwargs = paramnames(co)
    defaults, kw_defaults, annotations = make_defaults_and_annotations(
        toplevel,
        stack_builders,
    )
    if annotations:
        raise DecompilationError(
            "Unexpected annotations while building lambda: %s" % annotations
        )

    if isinstance(toplevel, instrs.MAKE_CLOSURE):
        # There should be a tuple of closure cells still on the stack here.
        # These don't appear in the AST, but we need to consume them to ensure
        # correctness down the line.
        _closure_cells = make_closure_cells(stack_builders)  # noqa

    body = pycode_to_body(co, DecompilationContext(in_lambda=True))
    if len(body) != 1:
        raise DecompilationError(
            "Got multiple expresssions for lambda: %s" % body,
        )
    body = body[0]

    return ast.Lambda(
        args=make_function_arguments(
            args,
            kwonly,
            varargs,
            varkwargs,
            defaults,
            kw_defaults,
            annotations,
        ),
        body=body,
    )


@_make_expr_internal.register(instrs.UNARY_NOT)
def _make_expr_unary_not(toplevel, stack_builders):
    return ast.UnaryOp(
        op=ast.Not(),
        operand=make_expr(stack_builders),
    )


@_make_expr_internal.register(instrs.CALL_FUNCTION)
def _make_expr_call_function(toplevel, stack_builders):
    keywords = make_call_keywords(stack_builders, toplevel.keyword)
    positionals = make_call_positionals(stack_builders, toplevel.positional)
    return ast.Call(
        func=make_expr(stack_builders),
        args=positionals,
        keywords=keywords,
        starargs=None,
        kwargs=None,
    )


@_make_expr_internal.register(instrs.CALL_FUNCTION_VAR)
def _make_expr_call_function_var(toplevel, stack_builders):
    starargs = make_expr(stack_builders)
    keywords = make_call_keywords(stack_builders, toplevel.keyword)
    positionals = make_call_positionals(stack_builders, toplevel.positional)
    return ast.Call(
        func=make_expr(stack_builders),
        args=positionals,
        keywords=keywords,
        starargs=starargs,
        kwargs=None,
    )


@_make_expr_internal.register(instrs.CALL_FUNCTION_KW)
def _make_expr_call_function_kw(toplevel, stack_builders):
    kwargs = make_expr(stack_builders)
    keywords = make_call_keywords(stack_builders, toplevel.keyword)
    positionals = make_call_positionals(stack_builders, toplevel.positional)
    return ast.Call(
        func=make_expr(stack_builders),
        args=positionals,
        keywords=keywords,
        starargs=None,
        kwargs=kwargs,
    )


@_make_expr_internal.register(instrs.CALL_FUNCTION_VAR_KW)
def _make_expr_call_function_var_kw(toplevel, stack_builders):
    kwargs = make_expr(stack_builders)
    starargs = make_expr(stack_builders)
    keywords = make_call_keywords(stack_builders, toplevel.keyword)
    positionals = make_call_positionals(stack_builders, toplevel.positional)
    return ast.Call(
        func=make_expr(stack_builders),
        args=positionals,
        keywords=keywords,
        starargs=starargs,
        kwargs=kwargs,
    )


def make_call_keywords(stack_builders, count):
    """
    Make the keywords entry for an ast.Call node.
    """
    out = []
    for _ in range(count):
        value = make_expr(stack_builders)
        load_kwname = stack_builders.pop()
        if not isinstance(load_kwname, instrs.LOAD_CONST):
            raise DecompilationError(
                "Expected a LOAD_CONST, but got %r" % load_kwname
            )
        if not isinstance(load_kwname.arg, str):
            raise DecompilationError(
                "Expected LOAD_CONST of a str, but got %r." % load_kwname,
            )
        out.append(ast.keyword(arg=load_kwname.arg, value=value))
    out.reverse()
    return out


def make_call_positionals(stack_builders, count):
    """
    Make the args entry for an ast.Call node.
    """
    out = [make_expr(stack_builders) for _ in range(count)]
    out.reverse()
    return out


@_make_expr_internal.register(instrs.BUILD_TUPLE)
def _make_expr_tuple(toplevel, stack_builders):
    return ast.Tuple(
        ctx=ast.Load(),
        elts=make_exprs(stack_builders, toplevel.arg),
    )


@_make_expr_internal.register(instrs.BUILD_SET)
def _make_expr_set(toplevel, stack_builders):
    return ast.Set(
        ctx=ast.Load(),
        elts=make_exprs(stack_builders, toplevel.arg),
    )


@_make_expr_internal.register(instrs.BUILD_LIST)
def _make_expr_list(toplevel, stack_builders):
    return ast.List(
        ctx=ast.Load(),
        elts=make_exprs(stack_builders, toplevel.arg),
    )


def make_exprs(stack_builders, count):
    """
    Make elements of set/list/tuple literal.
    """
    exprs = [make_expr(stack_builders) for _ in range(count)]
    # Elements are on the stack from right to left, but we want them from right
    # to left.
    exprs.reverse()
    return exprs


@_make_expr_internal.register(instrs.BUILD_MAP)
def _make_expr_empty_dict(toplevel, stack_builders):
    """
    This should only be hit for empty dicts.  Anything else should hit the
    STORE_MAP handler instead.
    """
    if toplevel.arg:
        raise DecompilationError(
            "make_expr() called with nonzero BUILD_MAP arg %d" % toplevel.arg
        )

    if stack_builders:
        raise DecompilationError(
            "Unexpected stack_builders for BUILD_MAP(0): %s" % stack_builders
        )
    return ast.Dict(keys=[], values=[])


@_make_expr_internal.register(instrs.STORE_MAP)
def _make_expr_dict(toplevel, stack_builders):

    # Push toplevel back onto the stack so that it gets correctly consumed by
    # `_make_dict_elems`.
    stack_builders.append(toplevel)

    build_map = find_build_map(stack_builders)
    dict_builders = popwhile(
        op.is_not(build_map), stack_builders, side='right'
    )

    # Consume the BUILD_MAP instruction.
    _build_map = stack_builders.pop()
    assert _build_map is build_map

    keys, values = _make_dict_elems(build_map, dict_builders)
    return ast.Dict(keys=keys, values=values)


def find_build_map(stack_builders):
    """
    Find the BUILD_MAP instruction for which the last element of
    ``stack_builders`` is a store.
    """
    assert isinstance(stack_builders[-1], instrs.STORE_MAP)

    to_consume = 0
    for instr in reversed(stack_builders):
        if isinstance(instr, instrs.STORE_MAP):
            # NOTE: This branch should always be hit on the first iteration.
            to_consume += 1
        elif isinstance(instr, instrs.BUILD_MAP):
            to_consume -= instr.arg
            if to_consume <= 0:
                return instr
    else:
        raise DecompilationError(
            "Couldn't find BUILD_MAP for last element of %s." % stack_builders
        )


def _make_dict_elems(build_instr, builders):
    """
    Return a list of keys and a list of values for the dictionary literal
    generated by ``build_instr``.
    """
    keys = []
    values = []
    for _ in range(build_instr.arg):
        popped = builders.pop()
        if not isinstance(popped, instrs.STORE_MAP):
            raise DecompilationError(
                "Expected a STORE_MAP but got %s" % popped
            )

        keys.append(make_expr(builders))
        values.append(make_expr(builders))

    # Keys and values are emitted in reverse order of how they appear in the
    # AST.
    keys.reverse()
    values.reverse()
    return keys, values


@_make_expr_internal.register(instrs.LOAD_DEREF)
@_make_expr_internal.register(instrs.LOAD_NAME)
@_make_expr_internal.register(instrs.LOAD_CLOSURE)
@_make_expr_internal.register(instrs.LOAD_FAST)
@_make_expr_internal.register(instrs.LOAD_GLOBAL)
def _make_expr_name(toplevel, stack_builders):
    return ast.Name(id=toplevel.arg, ctx=ast.Load())


@_make_expr_internal.register(instrs.LOAD_ATTR)
def _make_expr_attr(toplevel, stack_builders):
    return ast.Attribute(
        value=make_expr(stack_builders),
        attr=toplevel.arg,
        ctx=ast.Load(),
    )


@_make_expr_internal.register(instrs.BINARY_SUBSCR)
def _make_expr_getitem(toplevel, stack_builders):
    slice_ = make_slice(stack_builders)
    value = make_expr(stack_builders)
    return ast.Subscript(slice=slice_, value=value, ctx=ast.Load())


def make_slice(stack_builders):
    """
    Make an expression in the context of a slice.

    This mostly delegates to _make_expr, but wraps nodes in `ast.Index` or
    `ast.Slice` as appropriate.
    """
    return _make_slice(stack_builders.pop(), stack_builders)


@singledispatch
def _make_slice(toplevel, stack_builders):
    return ast.Index(_make_expr(toplevel, stack_builders))


@_make_slice.register(instrs.BUILD_SLICE)
def make_slice_build_slice(toplevel, stack_builders):
    return _make_expr(toplevel, stack_builders)


@_make_slice.register(instrs.BUILD_TUPLE)
def make_slice_tuple(toplevel, stack_builders):
    slice_ = _make_expr(toplevel, stack_builders)
    if isinstance(slice_, ast.Tuple):
        # a = b[c, d] generates Index(value=Tuple(...))
        # a = b[c:, d] generates ExtSlice(dims=[Slice(...), Index(...)])
        slice_ = normalize_tuple_slice(slice_)
    return slice_


def normalize_tuple_slice(node):
    """
    Normalize an ast.Tuple node representing the internals of a slice.

    Returns the node wrapped in an ast.Index.
    Returns an ExtSlice node built from the tuple elements if there are any
    slices.
    """
    if not any(isinstance(elt, ast.Slice) for elt in node.elts):
        return ast.Index(value=node)

    return ast.ExtSlice(
        [
            # Wrap non-Slice nodes in Index nodes.
            elt if isinstance(elt, ast.Slice) else ast.Index(value=elt)
            for elt in node.elts
        ]
    )


@_make_expr_internal.register(instrs.BUILD_SLICE)
def _make_expr_build_slice(toplevel, stack_builders):
    # Arg is always either 2 or 3.  If it's 3, then the first expression is the
    # step value.
    if toplevel.arg == 3:
        step = make_expr(stack_builders)
    else:
        step = None

    def normalize_empty_slice(node):
        """
        Convert LOAD_CONST(None) to just None.

        This normalizes slices of the form a[b:None] to just a[b:].
        """
        if isinstance(node, ast.NameConstant) and node.value is None:
            return None
        return node

    upper = normalize_empty_slice(make_expr(stack_builders))
    lower = normalize_empty_slice(make_expr(stack_builders))

    return ast.Slice(lower=lower, upper=upper, step=step)


@_make_expr_internal.register(instrs.LOAD_CONST)
def _make_expr_const(toplevel, stack_builders):
    return _make_const(toplevel.arg)


@singledispatch
def _make_const(const):
    raise DecompilationError(
        "Don't know how to make constant node for %r." % (const,)
    )


@_make_const.register(float)
@_make_const.register(complex)
@_make_const.register(int)
def _make_const_number(const):
    return ast.Num(n=const)


@_make_const.register(str)
def _make_const_str(const):
    return ast.Str(s=const)


@_make_const.register(bytes)
def _make_const_bytes(const):
    return ast.Bytes(s=const)


@_make_const.register(tuple)
def _make_const_tuple(const):
    return ast.Tuple(elts=list(map(_make_const, const)), ctx=ast.Load())


@_make_const.register(type(None))
def _make_const_none(none):
    return ast.NameConstant(value=None)


binops = frozenset([
    (instrs.BINARY_ADD, ast.Add),
    (instrs.BINARY_SUBTRACT, ast.Sub),
    (instrs.BINARY_MULTIPLY, ast.Mult),
    (instrs.BINARY_POWER, ast.Pow),
    (instrs.BINARY_TRUE_DIVIDE, ast.Div),
    (instrs.BINARY_FLOOR_DIVIDE, ast.FloorDiv),
    (instrs.BINARY_MODULO, ast.Mod),
    (instrs.BINARY_LSHIFT, ast.LShift),
    (instrs.BINARY_RSHIFT, ast.RShift),
    (instrs.BINARY_AND, ast.BitAnd),
    (instrs.BINARY_XOR, ast.BitXor),
    (instrs.BINARY_OR, ast.BitOr),
])


def _binop_handler(nodetype):
    """
    Factory function for binary operator handlers.
    """
    def _handler(toplevel, stack_builders):
        right = make_expr(stack_builders)
        left = make_expr(stack_builders)
        return ast.BinOp(left=left, op=nodetype(), right=right)
    return _handler


for instrtype, nodetype in binops:
    _process_instr.register(instrtype)(_push)
    _make_expr_internal.register(instrtype)(_binop_handler(nodetype))


def make_function(function_builders, *, closure):
    """
    Construct a FunctionDef AST node from a sequence of the form:

    LOAD_CLOSURE, N times (when handling MAKE_CLOSURE)
    BUILD_TUPLE(N) (when handling MAKE_CLOSURE)
    <decorator builders> (optional)
    <default builders>, (optional)
    <annotation builders> (optional)
    LOAD_CONST(<tuple of annotated names>) (optional)
    LOAD_CONST(code),
    LOAD_CONST(name),
    MAKE_FUNCTION | MAKE_CLOSURE
    <decorator calls> (optional)
    """
    decorator_calls = deque()
    while isinstance(function_builders[-1], instrs.CALL_FUNCTION):
        decorator_calls.appendleft(function_builders.pop())

    *builders, load_code_instr, load_name_instr, make_function_instr = (
        function_builders
    )

    _check_make_function_instrs(
        load_code_instr, load_name_instr, make_function_instr,
    )

    co = load_code_instr.arg
    name = load_name_instr.arg
    args, kwonly, varargs, varkwargs = paramnames(co)

    # Convert default and annotation builders to AST nodes.
    defaults, kw_defaults, annotations = make_defaults_and_annotations(
        make_function_instr,
        builders,
    )

    # Convert decorator function builders.  The stack is in reverse order.
    decorators = [make_expr(builders) for _ in decorator_calls]
    decorators.reverse()

    if closure:
        # There should be a tuple of closure cells still on the stack here.
        # These don't appear in the AST, but we need to consume them to ensure
        # correctness down the line.
        closure_cells = make_closure_cells(builders)  # noqa

    # We should have consumed all our builders by this point.
    if builders:
        raise DecompilationError(
            "Unexpected leftover builders for %s: %s." % (
                make_function_instr, builders
            )
        )

    return ast.FunctionDef(
        body_code=co,
        name=name.split('.')[-1],
        args=make_function_arguments(
            args,
            kwonly,
            varargs,
            varkwargs,
            defaults,
            kw_defaults,
            annotations,
        ),
        body=pycode_to_body(co, DecompilationContext(in_function_block=True)),
        decorator_list=decorators,
        returns=annotations.get('return'),
    )


def make_function_arguments(args,
                            kwonly,
                            varargs,
                            varkwargs,
                            defaults,
                            kw_defaults,
                            annotations):
    """
    Make an ast.arguments from the args parsed out of a code object.
    """
    return ast.arguments(
        args=[ast.arg(arg=a, annotation=annotations.get(a)) for a in args],
        kwonlyargs=[
            ast.arg(arg=a, annotation=annotations.get(a)) for a in kwonly
        ],
        defaults=defaults,
        kw_defaults=list(map(kw_defaults.get, kwonly)),
        vararg=None if varargs is None else ast.arg(
            arg=varargs, annotation=annotations.get(varargs),
        ),
        kwarg=None if varkwargs is None else ast.arg(
            arg=varkwargs, annotation=annotations.get(varkwargs)
        ),
    )


def make_closure_cells(stack_builders):
    cells = make_expr(stack_builders)
    if not isinstance(cells, ast.Tuple):
        raise DecompilationError(
            "Expected an ast.Tuple of closure cells, "
            "but got %s" % cells,
        )
    return cells


def make_global_and_nonlocal_decls(code_instrs):
    """
    Find all STORE_GLOBAL and STORE_DEREF instructions in `instrs` and convert
    them into a canonical list of `ast.Global` and `ast.Nonlocal` declarations.
    """
    globals_ = sorted(set(
        i.arg for i in code_instrs if isinstance(i, instrs.STORE_GLOBAL)
    ))
    nonlocals = sorted(set(
        i.arg for i in code_instrs
        if isinstance(i, instrs.STORE_DEREF) and i.vartype == 'free'
    ))

    out = []
    if globals_:
        out.append(ast.Global(names=globals_))
    if nonlocals:
        out.append(ast.Nonlocal(names=nonlocals))
    return out


def make_defaults_and_annotations(make_function_instr, builders):
    """
    Get the AST expressions corresponding to the defaults, kwonly defaults, and
    annotations for a function created by `make_function_instr`.
    """
    # Integer counts.
    n_defaults, n_kwonlydefaults, n_annotations = unpack_make_function_arg(
        make_function_instr.arg
    )
    if n_annotations:
        # TOS should be a tuple of annotation names.
        load_annotation_names = builders.pop()
        annotations = dict(zip(
            reversed(load_annotation_names.arg),
            (make_expr(builders) for _ in range(n_annotations - 1))
        ))
    else:
        annotations = {}

    kwonlys = {}
    while n_kwonlydefaults:
        default_expr = make_expr(builders)
        key_instr = builders.pop()
        if not isinstance(key_instr, instrs.LOAD_CONST):
            raise DecompilationError(
                "kwonlydefault key is not a LOAD_CONST: %s" % key_instr
            )
        if not isinstance(key_instr.arg, str):
            raise DecompilationError(
                "kwonlydefault key builder is not a "
                "'LOAD_CONST of a string: %s" % key_instr
            )

        kwonlys[key_instr.arg] = default_expr
        n_kwonlydefaults -= 1

    defaults = make_exprs(builders, n_defaults)
    return defaults, kwonlys, annotations


def unpack_make_function_arg(arg):
    """
    Unpack the argument to a MAKE_FUNCTION instruction.

    Parameters
    ----------
    arg : int
        The argument to a MAKE_FUNCTION instruction.

    Returns
    -------
    num_defaults, num_kwonly_default_pairs, num_annotations

    See Also
    --------
    https://docs.python.org/3/library/dis.html#opcode-MAKE_FUNCTION
    """
    return arg & 0xFF, (arg >> 8) & 0xFF, (arg >> 16) & 0x7FFF


def _check_make_function_instrs(load_code_instr,
                                load_name_instr,
                                make_function_instr,
                                *,
                                expect_lambda=False):
    """
    Validate the instructions passed to a make_function call.
    """

    # Validate load_code_instr.
    if not isinstance(load_code_instr, instrs.LOAD_CONST):
        raise TypeError(
            "make_function expected 'load_code_instr` to be a "
            "LOAD_CONST, but got %s" % load_code_instr,
        )
    if not isinstance(load_code_instr.arg, types.CodeType):
        raise TypeError(
            "make_function expected load_code_instr "
            "to load a code object, but got %s" % load_code_instr.arg,
        )

    # Validate load_name_instr
    if not isinstance(load_name_instr, instrs.LOAD_CONST):
        raise TypeError(
            "make_function expected 'load_name_instr` to be a "
            "LOAD_CONST, but got %s" % load_code_instr,
        )

    if not isinstance(load_name_instr.arg, str):
        raise TypeError(
            "make_function expected load_name_instr "
            "to load a string, but got %r instead" % load_name_instr.arg
        )

    # This is an endswith rather than '==' because the arg is the
    # fully-qualified name.
    is_lambda = is_lambda_name(load_name_instr.arg)
    if expect_lambda and not is_lambda:
        raise ValueError(
            "Expected to make a function named <lambda>, but "
            "got %r instead." % load_name_instr.arg
        )
    if not expect_lambda and is_lambda:
        raise ValueError("Unexpectedly received lambda function.")

    # Validate make_function_instr
    if not isinstance(make_function_instr, (instrs.MAKE_FUNCTION,
                                            instrs.MAKE_CLOSURE)):
        raise TypeError(
            "make_function expected a MAKE_FUNCTION or MAKE_CLOSURE"
            "instruction, but got %s instead." % make_function_instr
        )


def pop_arguments(instr, stack):
    """
    Pop instructions off `stack` until we pop all instructions that will
    produce values popped by `instr`.
    """
    needed = instr.stack_effect
    if needed >= 0:
        raise DecompilationError(
            "%s is does not have a negative stack effect" % instr
        )

    for popcount, to_pop in enumerate(reversed(stack), start=1):
        needed += to_pop.stack_effect
        if not needed:
            break
    else:
        raise DecompilationError(
            "Reached end of stack without finding inputs to %s" % instr,
        )

    popped = stack[-popcount:]
    stack[:] = stack[:-popcount]

    return popped


def _check_stack_for_module_return(stack):
    """
    Verify that the stack is in the expected state before the dummy
    RETURN_VALUE instruction of a module or class.
    """
    fail = (
        len(stack) != 1
        or not isinstance(stack[0], instrs.LOAD_CONST)
        or stack[0].arg is not None
    )

    if fail:
        raise DecompilationError(
            "Reached end of non-function code "
            "block with unexpected stack: %s." % stack
        )


def expect(instr, expected, context):
    """
    Check that an instruction is of the expected type.
    """
    if not isinstance(instr, expected):
        raise DecompilationError(
            "Expected a {expected} instruction {context}. Got {instr}.".format(
                instr=instr, expected=expected, context=context,
            )
        )
    return instr


def is_lambda_name(name):
    """
    Check if `name` is the name of lambda function.
    """
    return name.endswith('<lambda>')


def popwhile(cond, queue, *, side):
    """
    Pop elements off a queue while `cond(nextelem)` is True.

    Parameters
    ----------
    cond : predicate
    queue : deque
    side : {'left', 'right'}

    Returns
    -------
    popped : deque

    Examples
    --------
    >>> from collections import deque
    >>> d = deque([1, 2, 3, 2, 1])
    >>> popwhile(lambda x: x < 3, d, side='left')
    deque([1, 2])
    >>> d
    deque([3, 2, 1])
    >>> popwhile(lambda x: x < 3, d, side='right')
    deque([2, 1])
    >>> d
    deque([3])
    """
    if side not in ('left', 'right'):
        raise ValueError("`side` must be one of 'left' or 'right'")

    out = deque()

    if side == 'left':
        popnext = queue.popleft
        pushnext = out.append
        nextidx = 0
    else:
        popnext = queue.pop
        pushnext = out.appendleft
        nextidx = -1

    while queue:
        if not cond(queue[nextidx]):
            break
        pushnext(popnext())
    return out


def _current_test():
    """
    Get the string passed to the currently running call to
    `test_decompiler.check.`

    This is intended for use in debugging tests.  It should never be called in
    real code.
    """
    from codetransformer.tests.test_decompiler import _current_test as ct
    return ct


================================================
FILE: codetransformer/decompiler/__init__.py
================================================
import sys

from ..code import Flag


def paramnames(co):
    """
    Get the parameter names from a pycode object.

    Returns a 4-tuple of (args, kwonlyargs, varargs, varkwargs).
    varargs and varkwargs will be None if the function doesn't take *args or
    **kwargs, respectively.
    """
    flags = co.co_flags
    varnames = co.co_varnames

    argcount, kwonlyargcount = co.co_argcount, co.co_kwonlyargcount
    total = argcount + kwonlyargcount

    args = varnames[:argcount]
    kwonlyargs = varnames[argcount:total]
    varargs, varkwargs = None, None
    if flags & Flag.CO_VARARGS:
        varargs = varnames[total]
        total += 1
    if flags & Flag.CO_VARKEYWORDS:
        varkwargs = varnames[total]

    return args, kwonlyargs, varargs, varkwargs


if sys.version_info[:3] == (3, 4, 3):
    from ._343 import *  # noqa


================================================
FILE: codetransformer/instructions.py
================================================
from abc import ABCMeta, abstractmethod
from dis import opname, opmap, hasjabs, hasjrel, HAVE_ARGUMENT, stack_effect
from enum import (
    IntEnum,
    unique,
)
from operator import attrgetter
from re import escape

from .patterns import matchable
from .utils.immutable import immutableattr
from .utils.no_default import no_default


__all__ = ['Instruction'] + sorted(list(opmap))

# The instructions that use the co_names tuple.
_uses_name = frozenset({
    'DELETE_ATTR',
    'DELETE_GLOBAL',
    'DELETE_NAME',
    'IMPORT_FROM',
    'IMPORT_NAME',
    'LOAD_ATTR',
    'LOAD_GLOBAL',
    'LOAD_NAME',
    'STORE_ATTR',
    'STORE_GLOBAL',
    'STORE_NAME',
})
# The instructions that use the co_varnames tuple.
_uses_varname = frozenset({
    'LOAD_FAST',
    'STORE_FAST',
    'DELETE_FAST',
})
# The instructions that use the co_freevars tuple.
_uses_free = frozenset({
    'DELETE_DEREF',
    'LOAD_CLASSDEREF',
    'LOAD_CLOSURE',
    'LOAD_DEREF',
    'STORE_DEREF',
})


def _notimplemented(name):
    @property
    @abstractmethod
    def _(self):
        raise NotImplementedError(name)
    return _


@property
def _vartype(self):
    try:
        return self._vartype
    except AttributeError:
        raise AttributeError(
            "vartype is not available on instructions "
            "constructed outside of a Code object."
        )


class InstructionMeta(ABCMeta, matchable):
    _marker = object()  # sentinel
    _type_cache = {}

    def __init__(self, *args, opcode=None):
        return super().__init__(*args)

    def __new__(mcls, name, bases, dict_, *, opcode=None):
        try:
            return mcls._type_cache[opcode]
        except KeyError:
            pass

        if len(bases) != 1:
            raise TypeError(
                '{} does not support multiple inheritance'.format(
                    mcls.__name__,
                ),
            )

        if bases[0] is mcls._marker:
            dict_['_reprname'] = immutableattr(name)
            for attr in ('absjmp', 'have_arg', 'opcode', 'opname', 'reljmp'):
                dict_[attr] = _notimplemented(attr)
            return super().__new__(mcls, name, (object,), dict_)

        if opcode not in opmap.values():
            raise TypeError('Invalid opcode: {}'.format(opcode))

        opname_ = opname[opcode]
        dict_['opname'] = dict_['_reprname'] = immutableattr(opname_)
        dict_['opcode'] = immutableattr(opcode)

        absjmp = opcode in hasjabs
        reljmp = opcode in hasjrel
        dict_['absjmp'] = immutableattr(absjmp)
        dict_['reljmp'] = immutableattr(reljmp)
        dict_['is_jmp'] = immutableattr(absjmp or reljmp)

        dict_['uses_name'] = immutableattr(opname_ in _uses_name)
        dict_['uses_varname'] = immutableattr(opname_ in _uses_varname)
        dict_['uses_free'] = immutableattr(opname_ in _uses_free)
        if opname_ in _uses_free:
            dict_['vartype'] = _vartype

        dict_['have_arg'] = immutableattr(opcode >= HAVE_ARGUMENT)

        cls = mcls._type_cache[opcode] = super().__new__(
            mcls, opname[opcode], bases, dict_,
        )
        return cls

    def mcompile(self):
        return escape(bytes((self.opcode,)))

    def __repr__(self):
        return self._reprname
    __str__ = __repr__


class Instruction(InstructionMeta._marker, metaclass=InstructionMeta):
    """
    Base class for all instruction types.

    Parameters
    ----------
    arg : any, optional

        The argument for the instruction. This should be the actual value of
        the argument, for example, if this is a
        :class:`~codetransformer.instructions.LOAD_CONST`, use the constant
        value, not the index that would appear in the bytecode.
    """
    _no_arg = no_default

    def __init__(self, arg=_no_arg):
        if self.have_arg and arg is self._no_arg:
            raise TypeError(
                "{} missing 1 required argument: 'arg'".format(self.opname),
            )
        self.arg = self._normalize_arg(arg)
        self._target_of = set()
        self._stolen_by = None  # used for lnotab recalculation

    def __repr__(self):
        arg = self.arg
        return '{op}{arg}'.format(
            op=self.opname,
            arg='(%r)' % arg if self.arg is not self._no_arg else '',
        )

    @staticmethod
    def _normalize_arg(arg):
        return arg

    def steal(self, instr):
        """Steal the jump index off of `instr`.

        This makes anything that would have jumped to `instr` jump to
        this Instruction instead.

        Parameters
        ----------
        instr : Instruction
            The instruction to steal the jump sources from.

        Returns
        -------
        self : Instruction
            The instruction that owns this method.

        Notes
        -----
        This mutates self and ``instr`` inplace.
        """
        instr._stolen_by = self
        for jmp in instr._target_of:
            jmp.arg = self
        self._target_of = instr._target_of
        instr._target_of = set()
        return self

    @classmethod
    def from_opcode(cls, opcode, arg=_no_arg):
        """
        Create an instruction from an opcode and raw argument.

        Parameters
        ----------
        opcode : int
            Opcode for the instruction to create.
        arg : int, optional
            The argument for the instruction.

        Returns
        -------
        intsr : Instruction
            An instance of the instruction named by ``opcode``.
        """
        return type(cls)(opname[opcode], (cls,), {}, opcode=opcode)(arg)

    @property
    def stack_effect(self):
        """
        The net effect of executing this instruction on the interpreter stack.

        Instructions that pop values off the stack have negative stack effect
        equal to the number of popped values.

        Instructions that push values onto the stack have positive stack effect
        equal to the number of popped values.

        Examples
        --------
        - LOAD_{FAST,NAME,GLOBAL,DEREF} push one value onto the stack.
          They have a stack_effect of 1.
        - POP_JUMP_IF_{TRUE,FALSE} always pop one value off the stack.
          They have a stack effect of -1.
        - BINARY_* instructions pop two instructions off the stack, apply a
          binary operator, and push the resulting value onto the stack.
          They have a stack effect of -1 (-2 values consumed + 1 value pushed).
        """
        if self.opcode == NOP.opcode:  # noqa
            # dis.stack_effect is broken here
            return 0

        return stack_effect(
            self.opcode,
            *((self.arg if isinstance(self.arg, int) else 0,)
              if self.have_arg else ())
        )

    def equiv(self, instr):
        """Check equivalence of instructions. This checks against the types
        and the arguments of the instructions

        Parameters
        ----------
        instr : Instruction
            The instruction to check against.

        Returns
        -------
        is_equiv : bool
            If the instructions are equivalent.

        Notes
        -----
        This is a separate concept from instruction identity. Two separate
        instructions can be equivalent without being the same exact instance.
        This means that two equivalent instructions can be at different points
        in the bytecode or be targeted by different jumps.
        """
        return type(self) == type(instr) and self.arg == instr.arg


class _RawArg(int):
    """A class to hold arguments that are not yet initialized so that they
    don't break subclass's type checking code.

    This is used in the first pass of instruction creating in Code.from_pycode.
    """


def _mk_call_init(class_):
    """Create an __init__ function for a call type instruction.

    Parameters
    ----------
    class_ : type
        The type to bind the function to.

    Returns
    -------
    __init__ : callable
        The __init__ method for the class.
    """
    def __init__(self, packed=no_default, *, positional=0, keyword=0):
        if packed is no_default:
            arg = int.from_bytes(bytes((positional, keyword)), 'little')
        elif not positional and not keyword:
            arg = packed
        else:
            raise TypeError('cannot specify packed and unpacked arguments')
        self.positional, self.keyword = arg.to_bytes(2, 'little')
        super(class_, self).__init__(arg)

    return __init__


def _call_repr(self):
    return '%s(positional=%d, keyword=%d)' % (
        type(self).__name__,
        self.positional,
        self.keyword,
    )


def _check_jmp_arg(self, arg):
    if not isinstance(arg, (Instruction, _RawArg)):
        raise TypeError(
            'argument to %s must be an instruction, got: %r' % (
                type(self).__name__, arg,
            ),
        )
    if isinstance(arg, Instruction):
        arg._target_of.add(self)
    return arg


class CompareOpMeta(InstructionMeta):
    """
    Special-case metaclass for the COMPARE_OP instruction type that provides
    default constructors for the various kinds of comparisons.

    These default constructors are implemented as descriptors so that we can
    write::

        new_compare = COMPARE_OP.LT

    and have it be equivalent to::

        new_compare = COMPARE_OP(COMPARE_OP.comparator.LT)
    """

    @unique
    class comparator(IntEnum):
        LT = 0
        LE = 1
        EQ = 2
        NE = 3
        GT = 4
        GE = 5
        IN = 6
        NOT_IN = 7
        IS = 8
        IS_NOT = 9
        EXCEPTION_MATCH = 10

        def __repr__(self):
            return '<COMPARE_OP.%s.%s: %r>' % (
                self.__class__.__name__, self._name_, self._value_,
            )

    class ComparatorDescr:
        """
        A descriptor on the **metaclass** of COMPARE_OP that constructs new
        instances of COMPARE_OP on attribute access.

        Parameters
        ----------
        op : comparator
            The element of the `comparator` enum that this descriptor will
            forward to the COMPARE_OP constructor.
        """
        def __init__(self, op):
            self._op = op

        def __get__(self, instance, owner):
            # Since this descriptor is added to the current metaclass,
            # ``instance`` here is the COMPARE_OP **class**.

            if instance is None:
                # If someone does `CompareOpMeta.LT`, give them back the
                # descriptor object itself.
                return self

            # If someone does `COMPARE_OP.LT`, return a **new instance** of
            # COMPARE_OP.
            # We create new instances so that consumers can take ownership
            # without worrying about other jumps targeting the new instruction.
            return instance(self._op)

    # Dynamically add an instance of ComparatorDescr for each comparator
    # opcode.
    # This is equivalent to doing:
    # LT = ComparatorDescr(comparator.LT)
    # GT = ComparatorDescr(comparator.GT)
    # ...
    for c in comparator:
        locals()[c._name_] = ComparatorDescr(c)
    del c
    del ComparatorDescr


metamap = {
    'COMPARE_OP': CompareOpMeta,
}


globals_ = globals()
for name, opcode in opmap.items():
    globals_[name] = class_ = metamap.get(name, InstructionMeta)(
        opname[opcode],
        (Instruction,), {
            '__module__': __name__,
            '__qualname__': '.'.join((__name__, name)),
        },
        opcode=opcode,
    )
    if name.startswith('CALL_FUNCTION'):
        class_.__init__ = _mk_call_init(class_)
        class_.__repr__ = _call_repr

    if name == 'COMPARE_OP':
        class_._normalize_arg = staticmethod(class_.comparator)

    if class_.is_jmp:
        class_._normalize_arg = _check_jmp_arg

    class_.__doc__ = (
        """
        See Also
        --------
        dis.{name}
        """.format(name=name),
    )

    del class_


# Clean up the namespace
del name
del globals_
del metamap
del _check_jmp_arg
del _call_repr
del _mk_call_init

# The instructions that use the co_names tuple.
uses_name = frozenset(
    filter(attrgetter('uses_name'), Instruction.__subclasses__()),
)
# The instructions that use the co_varnames tuple.
uses_varname = frozenset(
    filter(attrgetter('uses_varname'), Instruction.__subclasses__()),
)
# The instructions that use the co_freevars tuple.
uses_free = frozenset(
    filter(attrgetter('uses_free'), Instruction.__subclasses__()),
)


================================================
FILE: codetransformer/patterns.py
================================================
from operator import methodcaller, index, attrgetter
import re
from types import MethodType

from .utils.instance import instance
from .utils.immutable import immutable


#: The default startcode for patterns.
DEFAULT_STARTCODE = 0
mcompile = methodcaller('mcompile')


def _prepr(m):
    if isinstance(m, or_):
        return '(%r)' % m

    return repr(m)


def coerce_ellipsis(p):
    """Convert ... into a matchany
    """
    if p is ...:
        return matchany

    return p


class matchable:
    """Mixin for defining the operators on patterns.
    """
    def __or__(self, other):
        other = coerce_ellipsis(other)
        if self is other:
            return self

        if not isinstance(other, matchable):
            return NotImplemented

        patterns = []
        if isinstance(self, or_):
            patterns.extend(self.matchables)
        else:
            patterns.append(self)
        if isinstance(other, or_):
            patterns.extend(other.matchables)
        else:
            patterns.append(other)

        return or_(*patterns)

    def __ror__(self, other):
        # Flip the order on the or method
        if not isinstance(other, matchable):
            return NotImplemented

        return type(self).__or__(coerce_ellipsis(other), self)

    def __invert__(self):
        return not_(self)

    def __getitem__(self, key):
        try:
            n = index(key)
        except TypeError:
            pass
        else:
            return matchrange(self, n)

        if isinstance(key, tuple) and len(key) in (1, 2):
            return matchrange(self, *key)

        if isinstance(key, modifier):
            return postfix_modifier(self, key)

        raise TypeError('invalid modifier: {0}'.format(key))


class postfix_modifier(immutable, matchable):
    """A pattern with a modifier paired with it.
    """
    __slots__ = 'matchable', 'modifier'

    def mcompile(self):
        return self.matchable.mcompile() + self.modifier.mcompile()

    def __repr__(self):
        return '%r[%r]' % (self.matchable, self.modifier)
    __str__ = __repr__


class meta(matchable):
    """Class for meta patterns and pattern likes. for example: ``matchany``.
    """
    def mcompile(self):
        return self._token

    def __repr__(self):
        return self._token.decode('utf-8')
    __str__ = __repr__


class modifier(meta):
    """Marker class for modifier types.
    """
    pass


@instance
class var(modifier):
    """Modifier that matches zero or more of a pattern.
    """
    _token = b'*'


@instance
class plus(modifier):
    """Modifier that matches one or more of a pattern.
    """
    _token = b'+'


@instance
class option(modifier):
    """Modifier that matches zero or one of a pattern.
    """
    _token = b'?'


class matchrange(immutable, meta, defaults={'m': None}):
    __slots__ = 'matchable', 'n', 'm'

    def mcompile(self):
        m = self.m
        return (
            self.matchable.mcompile() +
            b'{' +
            bytes(str(self.n), 'utf-8') +
            b',' + (b'' if m is None else (b', ' + bytes(str(m), 'utf-8'))) +
            b'}'
        )

    def __repr__(self):
        return '{matchable}[{args}]'.format(
            matchable=_prepr(self.matchable),
            args=', '.join(map(str, filter(bool, (self.n, self.m)))),
        )


@instance
class matchany(meta):
    """Matchable that matches any instruction.
    """
    _token = b'.'

    def __repr__(self):
        return '...'


class seq(immutable, matchable):
    """A sequence of matchables to match in order.

    Parameters
    ----------
    \*matchables : iterable of matchable
        The matchables to match against.
    """
    __slots__ = 'matchables',

    def __new__(cls, *matchables):
        if not matchables:
            raise TypeError('cannot create an empty sequence')

        if len(matchables) == 1:
            return coerce_ellipsis(matchables[0])
        return super().__new__(cls)

    def __init__(self, *matchables):
        self.matchables = tuple(map(coerce_ellipsis, matchables))

    def mcompile(self):
        return b''.join(map(mcompile, self.matchables))

    def __repr__(self):
        return '{cls}({args})'.format(
            cls=type(self).__name__,
            args=', '.join(map(_prepr, self.matchables))
        )


class or_(immutable, matchable):
    """Logical or of multiple matchables.

    Parameters
    ----------
    *matchables : iterable of matchable
        The matchables to or together.
    """
    __slots__ = '*matchables',

    def mcompile(self):
        return b'(' + b'|'.join(map(mcompile, self.matchables)) + b')'

    def __repr__(self):
        return ' | '.join(map(_prepr, self.matchables))


class not_(immutable, matchable):
    """Logical not of a matchable.
    """
    __slots__ = 'matchable',

    def mcompile(self):
        matchable = self.matchable
        if isinstance(matchable, (seq, or_, not_)):
            return b'((?!(' + matchable.mcompile() + b')).)*'

        return b'[^' + matchable.mcompile() + b']'

    def __repr__(self):
        return '~' + _prepr(self.matchable)


class pattern(immutable):
    """
    A pattern of instructions that can be matched against.

    This class is intended to be used as a decorator on methods of
    CodeTransformer subclasses.  It is used to mark that a given method should
    be called on sequences of instructions that match the pattern described by
    the inputs.

    Parameters
    ----------
    \*matchables : iterable of matchable
        The type of instructions to match against.
    startcodes : container of any
        The startcodes where this pattern should be tried.

    Examples
    --------
    Match a single BINARY_ADD instruction::

        pattern(BINARY_ADD)

    Match a single BINARY_ADD followed by a RETURN_VALUE::

        pattern(BINARY_ADD, RETURN_VALUE)

    Match a single BINARY_ADD followed by any other single instruction::

        pattern(BINARY_ADD, matchany)

    Match a single BINARY_ADD followed by any number of instructions::

        pattern(BINARY_ADD, matchany[var])
    """
    __slots__ = 'matchable', 'startcodes', '_compiled'

    def __init__(self, *matchables, startcodes=(DEFAULT_STARTCODE,)):
        if not matchables:
            raise TypeError('expected at least one matchable')
        self.matchable = matchable = seq(*matchables)
        self.startcodes = startcodes
        self._compiled = re.compile(matchable.mcompile())

    def __call__(self, f):
        return boundpattern(self._compiled, self.startcodes, f)

    def __repr__(self):
        return '{cls}(matchable={m!r}, startcodes={s})'.format(
            cls=type(self).__name__,
            m=self.matchable,
            s=self.startcodes,
        )


class boundpattern(immutable):
    """A pattern bound to a function.
    """
    __slots__ = '_compiled', '_startcodes', '_f'

    def __get__(self, instance, owner):
        if instance is None:
            return self

        return type(self)(
            self._compiled,
            self._startcodes,
            MethodType(self._f, instance)
        )

    def __call__(self, compiled_instrs, instrs, startcode):
        if startcode not in self._startcodes:
            raise NoMatch(compiled_instrs, startcode)

        match = self._compiled.match(compiled_instrs)
        if match is None or match.end is 0:
            raise NoMatch(compiled_instrs, startcode)

        mend = match.end()
        return self._f(*instrs[:mend]), mend


class NoMatch(Exception):
    """Indicates that there was no match found in this dispatcher.
    """
    pass


class patterndispatcher(immutable):
    """A set of patterns that can dispatch onto instrs.
    """
    __slots__ = '*patterns',

    def __get__(self, instance, owner):
        if instance is None:
            return self

        return boundpatterndispatcher(
            instance,
            *map(
                methodcaller('__get__', instance, owner),
                self.patterns,
            )
        )


class boundpatterndispatcher(immutable):
    """A set of patterns bound to a transformer.
    """
    __slots__ = 'transformer', '*patterns'

    def _dispatch(self, compiled_instrs, instrs, startcode):
        for p in self.patterns:
            try:
                return p(compiled_instrs, instrs, startcode)
            except NoMatch:
                pass

        raise NoMatch(instrs, startcode)

    def __call__(self, instrs):
        opcodes = bytes(map(attrgetter('opcode'), instrs))
        idx = 0  # The current index into the pre-transformed instrs.
        post_transform = []  # The instrs that have been transformed.
        transformer = self.transformer
        while idx < len(instrs):
            try:
                processed, nconsumed = self._dispatch(
                    opcodes[idx:],
                    instrs[idx:],
                    # NOTE: do not remove this attribute access
                    # self._dispatch can mutate the value of the startcode
                    transformer.startcode,
                )
            except NoMatch:
                post_transform.append(instrs[idx])
                idx += 1
            else:
                post_transform.extend(processed)
                idx += nconsumed
        return tuple(post_transform)


================================================
FILE: codetransformer/tests/__init__.py
================================================


================================================
FILE: codetransformer/tests/test_code.py
================================================
from dis import dis
from io import StringIO
from itertools import product, chain
import random
import sys

import pytest

from codetransformer.code import Code, Flag, pycode
from codetransformer.instructions import LOAD_CONST, LOAD_FAST, uses_free


@pytest.fixture(scope='module')
def sample_flags(request):
    random.seed(8025816322119661921)  # ayy lmao
    nflags = len(Flag.__members__)
    return tuple(
        dict(zip(Flag.__members__.keys(), case)) for case in chain(
            random.sample(list(product((True, False), repeat=nflags)), 1000),
            [[True] * nflags],
            [[False] * nflags],
        )
    )


def test_lnotab_roundtrip():
    # DO NOT ADD EXTRA LINES HERE
    def f():  # pragma: no cover
        a = 1
        b = 2
        c = 3
        d = 4
        a, b, c, d

    start_line = test_lnotab_roundtrip.__code__.co_firstlineno + 3
    lines = [start_line + n for n in range(5)]
    code = Code.from_pycode(f.__code__)
    lnotab = code.lnotab
    assert lnotab.keys() == set(lines)
    assert isinstance(lnotab[lines[0]], LOAD_CONST)
    assert lnotab[lines[0]].arg == 1
    assert isinstance(lnotab[lines[1]], LOAD_CONST)
    assert lnotab[lines[1]].arg == 2
    assert isinstance(lnotab[lines[2]], LOAD_CONST)
    assert lnotab[lines[2]].arg == 3
    assert isinstance(lnotab[lines[3]], LOAD_CONST)
    assert lnotab[lines[3]].arg == 4
    assert isinstance(lnotab[lines[4]], LOAD_FAST)
    assert lnotab[lines[4]].arg == 'a'
    assert f.__code__.co_lnotab == code.py_lnotab == code.to_pycode().co_lnotab


def test_lnotab_really_dumb_whitespace():
    ns = {}
    exec('def f():\n    lol = True' + '\n' * 1024 + '    wut = True', ns)
    f = ns['f']
    code = Code.from_pycode(f.__code__)
    lines = [2, 1026]
    lnotab = code.lnotab
    assert lnotab.keys() == set(lines)
    assert isinstance(lnotab[lines[0]], LOAD_CONST)
    assert lnotab[lines[0]].arg
    assert isinstance(lnotab[lines[1]], LOAD_CONST)
    assert lnotab[lines[1]].arg
    assert f.__code__.co_lnotab == code.py_lnotab == code.to_pycode().co_lnotab


def test_flag_packing(sample_flags):
    for flags in sample_flags:
        assert Flag.unpack(Flag.pack(**flags)) == flags


def test_flag_unpack_too_big():
    assert all(Flag.unpack(Flag.max).values())
    with pytest.raises(ValueError):
        Flag.unpack(Flag.max + 1)


def test_flag_max():
    assert Flag.pack(
        CO_OPTIMIZED=True,
        CO_NEWLOCALS=True,
        CO_VARARGS=True,
        CO_VARKEYWORDS=True,
        CO_NESTED=True,
        CO_GENERATOR=True,
        CO_NOFREE=True,
        CO_COROUTINE=True,
        CO_ITERABLE_COROUTINE=True,
        CO_FUTURE_DIVISION=True,
        CO_FUTURE_ABSOLUTE_IMPORT=True,
        CO_FUTURE_WITH_STATEMENT=True,
        CO_FUTURE_PRINT_FUNCTION=True,
        CO_FUTURE_UNICODE_LITERALS=True,
        CO_FUTURE_BARRY_AS_BDFL=True,
        CO_FUTURE_GENERATOR_STOP=True,
    ) == Flag.max


def test_flag_max_immutable():
    with pytest.raises(AttributeError):
        Flag.CO_OPTIMIZED.max = None


def test_code_multiple_varargs():
    with pytest.raises(ValueError) as e:
        Code(
            (), (
                '*args',
                '*other',
            ),
        )

    assert str(e.value) == 'cannot specify *args more than once'


def test_code_multiple_kwargs():
    with pytest.raises(ValueError) as e:
        Code(
            (), (
                '**kwargs',
                '**kwargs',
            ),
        )

    assert str(e.value) == 'cannot specify **kwargs more than once'


@pytest.mark.parametrize('cls', uses_free)
def test_dangling_var(cls):
    instr = cls('dangling')
    with pytest.raises(ValueError) as e:
        Code((instr,))

    assert (
        str(e.value) ==
        "Argument to %r is not in cellvars or freevars." % instr
    )


def test_code_flags(sample_flags):
    attr_map = {
        'CO_NESTED': 'is_nested',
        'CO_GENERATOR': 'is_generator',
        'CO_COROUTINE': 'is_coroutine',
        'CO_ITERABLE_COROUTINE': 'is_iterable_coroutine',
        'CO_NEWLOCALS': 'constructs_new_locals',
    }
    for flags in sample_flags:
        if sys.version_info < (3, 6):
            codestring = b'd\x00\x00S'  # return None
        else:
            codestring = b'd\x00S'  # return None

        code = Code.from_pycode(pycode(
            argcount=0,
            kwonlyargcount=0,
            nlocals=2,
            stacksize=0,
            flags=Flag.pack(**flags),
            codestring=codestring,
            constants=(None,),
            names=(),
            varnames=('a', 'b'),
            filename='',
            name='',
            firstlineno=0,
            lnotab=b'',
        ))
        assert code.flags == flags
        for flag, attr in attr_map.items():
            if flags[flag]:
                assert getattr(code, attr)


@pytest.fixture
def abc_code():
    a = LOAD_CONST('a')
    b = LOAD_CONST('b')
    c = LOAD_CONST('c')  # not in instrs
    code = Code((a, b), argnames=())

    return (a, b, c), code


def test_instr_index(abc_code):
    (a, b, c), code = abc_code

    assert code.index(a) == 0
    assert code.index(b) == 1

    with pytest.raises(ValueError):
        code.index(c)


def test_code_contains(abc_code):
    (a, b, c), code = abc_code

    assert a in code
    assert b in code
    assert c not in code


def test_code_dis(capsys):
    @Code.from_pyfunc
    def code():  # pragma: no cover
        a = 1
        b = 2
        return a, b

    buf = StringIO()
    dis(code.to_pycode(), file=buf)
    expected = buf.getvalue()

    code.dis()
    out, err = capsys.readouterr()
    assert not err
    assert out == expected

    buf = StringIO()
    code.dis(file=buf)
    assert buf.getvalue() == expected


================================================
FILE: codetransformer/tests/test_core.py
================================================
import pytest
import toolz.curried.operator as op

from codetransformer import CodeTransformer, Code, pattern
from codetransformer.core import Context, NoContext
from codetransformer.instructions import Instruction
from codetransformer.patterns import DEFAULT_STARTCODE
from codetransformer.utils.instance import instance


def test_inherit_patterns():
    class C(CodeTransformer):
        matched = False

        @pattern(...)
        def _(self, instr):
            self.matched = True
            yield instr

    class D(C):
        pass

    d = D()
    assert not d.matched

    @d
    def f():
        pass

    assert d.matched


def test_override_patterns():
    class C(CodeTransformer):
        matched_super = False
        matched_sub = False

        @pattern(...)
        def _(self, instr):
            self.matched_super = True
            yield instr

    class D(C):
        @pattern(...)
        def _(self, instr):
            self.matched_sub = True
            yield instr

    d = D()
    assert not d.matched_super
    assert not d.matched_sub

    @d
    def f():
        pass

    assert d.matched_sub
    assert not d.matched_super


def test_updates_lnotab():
    @instance
    class c(CodeTransformer):
        @pattern(...)
        def _(self, instr):
            yield type(instr)(instr.arg).steal(instr)

    def f():  # pragma: no cover
        # this function has irregular whitespace for testing the lnotab
        a = 1
        # intentional line
        b = 2
        # intentional line
        c = 3
        # intentional line
        return a, b, c

    original = Code.from_pyfunc(f)
    post_transform = c.transform(original)

    # check that something happened
    assert original.lnotab != post_transform.lnotab
    # check that we preserved the line numbers
    assert (
        original.lnotab.keys() ==
        post_transform.lnotab.keys() ==
        set(map(op.add(original.firstlineno), (2, 4, 6, 8)))
    )

    def sorted_instrs(lnotab):
        order = sorted(lnotab.keys())
        for idx in order:
            yield lnotab[idx]

    # check that the instrs are correct
    assert all(map(
        Instruction.equiv,
        sorted_instrs(original.lnotab),
        sorted_instrs(post_transform.lnotab),
    ))

    # sanity check that the function is correct
    assert f() == c(f)()


def test_context():
    def f():  # pragma: no cover
        pass

    code = Code.from_pyfunc(f)
    c = Context(code)

    # check default attributes
    assert c.code is code
    assert c.startcode == DEFAULT_STARTCODE

    # check that the object acts like a namespace
    c.attr = 'test'
    assert c.attr == 'test'


def test_no_context():
    @instance
    class c(CodeTransformer):
        pass

    with pytest.raises(NoContext) as e:
        c.context

    assert str(e.value) == 'no active transformation context'


================================================
FILE: codetransformer/tests/test_decompiler.py
================================================
"""
Tests for decompiler.py
"""
from ast import AST, iter_fields, Module, parse
from functools import partial
from itertools import product, zip_longest, combinations_with_replacement
import sys
from textwrap import dedent

import pytest
from toolz.curried.operator import add

from codetransformer import a as show  # noqa

_343 = sys.version_info[:3] == (3, 4, 3)
pytestmark = pytest.mark.skipif(
    not _343,
    reason='decompiler only runs on 3.4',
)
if _343:
    from ..decompiler import (
        DecompilationContext,
        decompile,
        paramnames,
        pycode_to_body,
    )

_current_test = None


def make_indented_body(body_str):
    """
    Helper for generating an indented string to use as the body of a function.
    """
    return '\n'.join(
        map(
            add("    "),
            dedent(body_str).splitlines(),
        )
    )


def compare(computed, expected):
    """
    Assert that two AST nodes are the same.
    """
    assert type(computed) == type(expected)

    if isinstance(computed, list):
        for cv, ev in zip_longest(computed, expected):
            compare(cv, ev)
        return

    if not isinstance(computed, AST):
        assert computed == expected
        return

    for (cn, cv), (en, ev) in zip_longest(*map(iter_fields,
                                               (computed, expected))):
        assert cn == en
        compare(cv, ev)


def check(text, ast_text=None):
    """
    Check that compiling and disassembling `text` produces the same AST tree as
    calling ast.parse on `ast_text`.  If `ast_text` is not passed, use `text`
    for both.
    """
    global _current_test
    _current_test = text

    if ast_text is None:
        ast_text = text

    ast = parse(ast_text)

    code = compile(text, '<test>', 'exec')

    decompiled_ast = Module(
        body=pycode_to_body(code, DecompilationContext()),
    )

    compare(decompiled_ast, ast)


def check_formatted(text, ast_text=None, **fmt_kwargs):
    text = text.format(**fmt_kwargs)
    if ast_text is not None:
        ast_text = ast_text.format(**fmt_kwargs)
    check(text, ast_text)


# Bodies for for/while loops.
LOOP_BODIES = tuple(map(
    '\n'.join,
    combinations_with_replacement(
        [
            "x = 1",
            "break",
            "continue",
            dedent(
                """\
                while u + v:
                    w = z
                """,
            ),
            dedent(
                """\
                for u in v:
                    w = z
                """,
            ),
        ],
        3,
    ),
))
# Bodies for for-else/while-else blocks.
ORELSE_BODIES = ["", "x = 3"]
# LHS of assignment, or bindings in a for-loop.
NAME_BINDS = [
    "a",
    "(a, b)",
    "(a,)",
    "a, ((b, c, d), (e, f))",
]


def test_decompile():
    def foo(a, b, *, c):
        return a + b + c
    decompiled = decompile(foo)

    # NOTE: We can't reliably match the ast for defaults and annotations, since
    # we can't tell how they were defined.
    s = dedent(
        """
        def foo(a, b, *, c):
            return a + b + c
        """
    )
    compiled = parse(s)
    compare(decompiled, compiled.body[0])


def test_trivial_expr():
    check("a")


@pytest.mark.parametrize(
    'lhs,rhs', product(NAME_BINDS, ['x', 'x.y() + z.w()']),
)
def test_assign(lhs, rhs):
    check("{lhs} = {rhs}".format(lhs=lhs, rhs=rhs))


def test_unpack_to_attribute():
    check("((a.b, c.d.e), f) = g")
    check("((a[b], c[d][e]), f) = g")
    check("((a[b].c, d.e[f]), g) = h")


def test_chained_assign():
    check("a = b = c = d")
    check("a.b = (c,) = d[e].f = g")
    check("a.b = (c, d[e].f) = g")


def test_unary_not():
    check("a = not b")
    check("a = not not b")
    check("a = not ((not a) + b)")


@pytest.mark.parametrize(
    'op', [
        '+',
        '-',
        '*',
        '**',
        '/',
        '//',
        '%',
        '<<',
        '>>',
        '&',
        '^',
        '|',
    ]
)
def test_binary_ops(op):
    check("a {op} b".format(op=op))
    check("a = b {op} c".format(op=op))
    check("a = (b {op} c) {op} d".format(op=op))
    check("a = b {op} (c {op} d)".format(op=op))


def test_string_literal():
    # A string literal as the first expression in a module generates a
    # STORE_NAME to __doc__.  We can't tell the difference between this and an
    # actual assignment to __doc__.
    check("'a'", "__doc__ = 'a'")
    check("'abc'", "__doc__ = 'abc'")

    check("a = 'a'")
    check("a = u'a'")


def test_bytes_literal():
    check("b'a'")
    check("b'abc'")
    check("a = b'a'")


def test_int_literal():
    check("1", "")  # This gets constant-folded out
    check("a = 1")
    check("a = 1 + b")
    check("a = b + 1")


def test_float_literal():
    check('1.0', "")   # This gets constant-folded out
    check("a = 1.0")
    check("a = 1.0 + b")
    check("a = b + 1.0")


def test_complex_literal():
    check('1.0j', "")  # This gets constant-folded out
    check("a = 1.0j")
    check("a = 1.0j + b")
    check("a = b + 1.0j")


def test_tuple_literals():
    check("()")
    check("(1,)")
    check("(a,)")
    check("(1, a)")
    check("(1, 'a')")
    check("((1,), a)")
    check("((1,(b,)), a)")


def test_set_literals():
    check("{1}")
    check("{1, 'a'}")
    check("a = {1, 'a'}")


def test_list_literals():
    check("[]")
    check("[1]")
    check("[a]")
    check("[[], [a, 1]]")


def test_dict_literals():
    check("{}")
    check("{a: b}")
    check("{a + a: b + b}")
    check("{a: b, c: d}")
    check("{1: 2, c: d}")
    check("{a: {b: c}, d: e}")

    check("{a: {b: {c: d}, e: {f: g}}}")

    check("{a: {b: [c, d, e]}}")
    check("a + {b: c}")


def test_function_call():
    check("f()")
    check("f(a, b, c=1, d=2)")

    check("f(*args)")
    check("f(a, b=1, *args)")

    check("f(**kwargs)")
    check("f(a, b=1, **kwargs)")

    check("f(*args, **kwargs)")
    check("f(a, b=1, *args, **kwargs)")

    check("(a + b)()")
    check("a().b.c.d()")


def test_paramnames():

    def foo(a, b):
        x = 1
        return x

    args, kwonlyargs, varargs, varkwargs = paramnames(foo.__code__)
    assert args == ('a', 'b')
    assert kwonlyargs == ()
    assert varargs is None
    assert varkwargs is None

    def bar(a, *, b):
        x = 1
        return x

    args, kwonlyargs, varargs, varkwargs = paramnames(bar.__code__)
    assert args == ('a',)
    assert kwonlyargs == ('b',)
    assert varargs is None
    assert varkwargs is None

    def fizz(a, **kwargs):
        x = 1
        return x

    args, kwonlyargs, varargs, varkwargs = paramnames(fizz.__code__)
    assert args == ('a',)
    assert kwonlyargs == ()
    assert varargs is None
    assert varkwargs == 'kwargs'

    def buzz(a, b=1, *args, c, d=3, **kwargs):
        x = 1
        return x

    args, kwonlyargs, varargs, varkwargs = paramnames(buzz.__code__)
    assert args == ('a', 'b')
    assert kwonlyargs == ('c', 'd')
    assert varargs == 'args'
    assert varkwargs == 'kwargs'


@pytest.mark.parametrize(
    "signature,expr",
    product(
        [
            "",
            "a",
            "a, b",
            "*a, b",
            "a, **b",
            "*a, **b",
            "a=1, b=2, c=3",
            "a, *, b=1, c=2, d=3",
            "a, b=1, c=2, *, d, e=3, f, g=4",
            "a, b=1, *args, c, d=2, **kwargs",
            "a, b=c + d, *, e=f + g",
        ],
        [
            "a + b",
            "None",
            "lambda x: lambda y: lambda z: (x, y, z)",
            "[lambda x: a + b, 1]",
            "[(lambda y: a + b) + (lambda z: d + e), 1]",
        ],
    ),
)
def test_lambda(signature, expr):
    check_formatted("lambda {sig}: {expr}", sig=signature, expr=expr)
    check_formatted("func = (lambda {sig}: {expr})", sig=signature, expr=expr)
    check_formatted(
        dedent(
            """
            def foo():
                return (lambda {sig}: {expr})()
            """
        ),
        sig=signature,
        expr=expr,
    )


def test_simple_function():
    check(
        dedent(
            """\
            def foo(a, b):
                return a + b
            """
        )
    )


def test_annotations():
    check(
        dedent(
            """\
            def foo(a: b, c: d):
                return 3
            """
        )
    )
    check(
        dedent(
            """\
            def foo(a: b, c=1, *args: d, e:f, g:h=i, **kwargs: j):
                return a + c
            """
        )
    )
    check(
        dedent(
            """\
            def foo(a: b * 3, c=1, *args: d, e:f, g:h=i, **kwargs: j) -> k:
                return a + c
            """
        )
    )


@pytest.mark.parametrize(
    "signature,body",
    product(
        [
            "()",
            "(a)",
            "(a, b)",
            "(*a, b)",
            "(a, **b)",
            "(*a, **b)",
            "(a=1, b=2, c=3)",
            "(a, *, b=1, c=2, d=3)",
            "(a, b=1, c=2, *, d, e=3, f, g=4)",
            "(a, b=1, *args, c, d=2, **kwargs)",
            "(a, b=c + d, *, e=f + g)",
        ],
        [
            """\
            return a + b
            """,
            """\
            x = 1
            y = 2
            return x + y
            """,
            """\
            x = 3
            def bar(m, n):
                global x
                x = 4
                return m + n + x
            return None
            """,
            """\
            def bar():
                x = 3
                def buzz():
                    nonlocal x
                    x = 4
                    return x
                return x
            return None
            """
        ],
    ),
)
def test_function_signatures(signature, body):
    check(
        dedent(
            """\
            def foo{signature}:
            {body}
            """
        ).format(signature=signature, body=make_indented_body(body))
    )


def test_decorators():
    check(
        dedent(
            """
            @decorator2
            @decorator1()
            @decorator0.attr.attr
            def foo(a, b=1, *, c, d=2):
                @decorator3
                def bar(c, d):
                    x = 1
                    return None
                return None
            """
        )
    )


def test_store_twice_to_global():
    check(
        dedent(
            """\
            x = 3
            def foo():
                global x
                x = 4
                x = 5
                return None
            """
        )
    )


def test_store_twice_to_nonlocal():
    check(
        dedent(
            """\
            def foo():
                x = 1
                def bar():
                    nonlocal x
                    x = 2
                    x = 3
                    return None
                return None
            """
        )
    )


def test_getattr():
    check("a.b")
    check("a.b.c")
    check("a.b.c + a.b.c")

    check("(1).real")
    check("1..real")

    check("(a + b).c")

    check("a = b.c")


def test_setattr():
    check("a.b = c")
    check("a.b.c = d")
    check("a.b.c = d.e.f")
    check("(a + b).c = (d + e).f")


def test_getitem():
    check("a = b[c]")
    check("a = b[c:]")
    check("a = b[:c]")
    check("a = b[c::]")
    check("a = b[c:d]")
    check("a = b[c:d:e]")

    check("a = b[c, d]")
    check("a = b[c:, d]")
    check("a = b[c:d:e, f:g:h, i:j:k]")

    check("a = b[c + d][e]")


def test_setitem():
    check("a[b] = c")
    check("b[c:] = a")
    check("b[:c] = a")
    check("b[c::] = a")
    check("b[c:d] = a")
    check("b[c:d:e] = a")

    check("b[c, d] = a")
    check("b[c:, d] = a")
    check("b[c:d:e, f:g:h, i:j:k] = a")

    check("b[c + d][e] = a")


@pytest.mark.parametrize(
    "loop,body,else_body",
    product(
        [
            "for a in b:",
            "for a in b.c.d:",
            "for (a, (b, c), d) in e:"
        ],
        LOOP_BODIES,
        ORELSE_BODIES,
    )
)
def test_for(loop, body, else_body):
    check(
        dedent(
            """\
            {loop}
            {body}
            {else_}
            {else_body}
            x = 4
            """
        ).format(
            loop=loop,
            body=make_indented_body(body),
            else_="else:" if else_body else "",
            else_body=make_indented_body(else_body) if else_body else "",
        )
    )


@pytest.mark.parametrize(
    "condition,body,else_body",
    product(
        [
            "a",
            "not a",
            "not not a",
            "a.b.c.d",
            "not a.b.c.d",
            "True",
        ],
        LOOP_BODIES,
        ORELSE_BODIES,
    )
)
def test_while(condition, body, else_body):
    check(
        dedent(
            """\
            while {condition}:
            {body}
            {else_}
            {else_body}
            x = 4
            """
        ).format(
            condition=condition,
            body=make_indented_body(body),
            else_="else:" if else_body else "",
            else_body=make_indented_body(else_body) if else_body else "",
        )
    )


def test_while_False():
    # The peephole optimizer removes while <falsey constant> blocks entirely.
    check(
        dedent(
            """\
            while False:
                x = 1
                y = 2
            """
        ),
        ""
    )


def test_import():
    check("import a as b")
    check("import a.b as c")
    # These generate identical bytecode.
    check(
        "import a, b",
        dedent(
            """\
            import a
            import b
            """
        )
    )
    check("import a.b.c")
    # These generate identical bytecode.
    check(
        "import a.b.c as d, e.f.g as h",
        dedent(
            """
            import a.b.c as d
            import e.f.g as h
            """
        )
    )


def test_import_from():
    check("from a import b")
    check("from a import b, c as d, d")
    check("from a.b import c, d as e, f as g")


def test_import_star():
    check("from a import *")
    check("from a.b.c import *")


def test_import_attribute_aliasing_module():
    check("import a.b as a")


def test_import_in_function():
    check(
        dedent(
            """\
            def foo():
                import a.b.c as d
                from e.f import g
                return None
            """
        )
    )
    check(
        dedent(
            """\
            def foo():
                global d, g
                import a.b.c as d
                from e.f import g
                return None
            """
        )
    )
    check(
        dedent(
            """\
            def foo():
                d = None
                g = None
                def bar():
                    nonlocal d, g
                    import a.b.c as d
                    from e.f import g
                    return None
                return None
            """
        )
    )


def test_with_block():
    check(
        dedent(
            """
            with a.b.c:
                c = d
                e = f()
            """
        )
    )

    # Tests for various kinds of stores from the with assignment.
    check(
        dedent(
            """
            with a as b:
                c = d
            """
        )
    )

    check(
        dedent(
            """
            def foo():
                with a as b:
                    c = d
                return None
            """
        )
    )

    check(
        dedent(
            """
            def foo():
                global b
                with a as b:
                    c = d
                return None
            """
        )
    )

    check(
        dedent(
            """
            def foo():
                with a as b:
                    def bar():
                        nonlocal b
                        b = None
                        return c
                return None
            """
        )
    )


def test_nested_with():
    check(
        dedent(
            """
            with a:
                with b:
                    with c:
                        x = 3
                    y = 4
                z = 5
            """
        )
    )
    # This is indistinguishable in bytecode from:
    # with a:
    #     with b:
    #         with c as d:
    #             e = f
    # We normalize the former to the latter.
    check(
        dedent(
            """
            with a, b, c as d:
                e = f
            """
        ),
    )


def test_simple_if():
    check(
        dedent(
            """
            if a:
                b = c
            x = "end"
            """
        )
    )


def test_if_return():
    check(
        dedent(
            """
            def f():
                if a:
                    return b
                return None
            """
        )
    )


def test_if_else():
    check(
        dedent(
            """
            if a:
                b = c
            else:
                b = d
            x = "end"
            """
        )
    )


@pytest.mark.parametrize(
    'last_statement,prefix',
    product(
        ("", "x = 'end'"),
        ("not", "not not"),
    ),
)
def test_if_elif(last_statement, prefix):
    check(
        dedent(
            """\
            if {prefix} a:
                b = c
            elif d:
                e = f
            elif {prefix} g:
                h = i
            else:
                j = k
            {last_statement}
            """
        ).format(prefix=prefix, last_statement=last_statement)
    )

    check(
        dedent(
            """
            if a:
                x = "before_b"
                if {prefix} b:
                    x = "in_b"
                elif b:
                    x = "in_elif_b"
                else:
                    x = "else_b"
                w = "after_b"
            elif c:
                x = "in_c"
            else:
                x = "in_else"
            {last_statement}
            """
        ).format(prefix=prefix, last_statement=last_statement)
    )


@pytest.mark.parametrize(
    'op', ['and', 'or'],
)
def test_boolops(op):
    check_ = partial(check_formatted, op=op)

    check_("a {op} b")
    check_("a {op} b {op} c")
    check_("a + (b {op} c)")
    check_("(a {op} b) + c")
    check_("(a + b) {op} (c + d)")
    check_("a + (b {op} c) + d")

    check_("a {op} (1 + (b {op} c))")


@pytest.mark.parametrize(
    'op', ['and', 'or'],
)
def test_normalize_nested_boolops(op):
    check_ = partial(check_formatted, op=op)

    # These generate identical bytecode, but they're different at the AST
    # level.  We normalize to minimally-nested form.
    check_("a {op} (b {op} c)", "a {op} b {op} c")
    check_("(a {op} b) {op} c", "a {op} b {op} c")

    check_("a {op} (b {op} (c {op} d))", "a {op} b {op} c {op} d")
    check_("((a {op} b) {op} c) {op} d", "a {op} b {op} c {op} d")

    check_("(a {op} b) {op} (c {op} d)", "a {op} b {op} c {op} d")
    check_("a {op} (b {op} c) {op} d", "a {op} b {op} c {op} d")


def test_mixed_boolops():
    check("a or b and c and d")


================================================
FILE: codetransformer/tests/test_instructions.py
================================================
from codetransformer.instructions import Instruction


def test_repr_types():
    assert repr(Instruction) == 'Instruction'
    for tp in Instruction.__subclasses__():
        assert repr(tp) == tp.opname


================================================
FILE: codetransformer/transformers/__init__.py
================================================
from .constants import asconstants
from .interpolated_strings import interpolated_strings
from .pattern_matched_exceptions import pattern_matched_exceptions
from .precomputed_slices import precomputed_slices
from .literals import (
    bytearray_literals,
    decimal_literals,
    haskell_strs,
    islice_literals,
    overloaded_complexes,
    overloaded_floats,
    overloaded_ints,
    overloaded_lists,
    overloaded_sets,
    overloaded_slices,
    overloaded_strs,
    overloaded_tuples,
)


__all__ = [
    'asconstants',
    'bytearray_literals',
    'decimal_literals',
    'haskell_strs',
    'interpolated_strings',
    'islice_literals',
    'overloaded_complexes',
    'overloaded_floats',
    'overloaded_ints',
    'overloaded_lists',
    'overloaded_sets',
    'overloaded_slices',
    'overloaded_strs',
    'overloaded_tuples',
    'pattern_matched_exceptions',
    'precomputed_slices',
]


================================================
FILE: codetransformer/transformers/add2mul.py
================================================
"""
add2mul
--------

A transformer that replaces BINARY_ADD instructions with BINARY_MULTIPLY
instructions.

This isn't useful, but it's good introductory example/tutorial material.
"""
from codetransformer import CodeTransformer, pattern
from codetransformer.instructions import BINARY_ADD, BINARY_MULTIPLY


class add2mul(CodeTransformer):
    @pattern(BINARY_ADD)
    def _add2mul(self, add_instr):
        yield BINARY_MULTIPLY().steal(add_instr)


================================================
FILE: codetransformer/transformers/constants.py
================================================
import builtins

from ..core import CodeTransformer
from ..instructions import (
    DELETE_DEREF,
    DELETE_FAST,
    DELETE_GLOBAL,
    DELETE_NAME,
    LOAD_CLASSDEREF,
    LOAD_CONST,
    LOAD_DEREF,
    LOAD_GLOBAL,
    LOAD_NAME,
    STORE_DEREF,
    STORE_FAST,
    STORE_GLOBAL,
    STORE_NAME,
)
from ..patterns import pattern


def _assign_or_del(type_):
    assert type_ in ('assign to', 'delete')

    def handler(self, instr):
        name = instr.arg
        if name not in self._constnames:
            yield instr
            return

        code = self.code
        filename = code.filename
        lno = code.lno_of_instr[instr]
        try:
            with open(filename) as f:
                line = f.readlines()[lno - 1]
        except IOError:
            line = '???'

        raise SyntaxError(
            "can't %s constant name %r" % (type_, name),
            (filename, lno, len(line), line),
        )

    return handler


class asconstants(CodeTransformer):
    """
    A code transformer that inlines names as constants.

    - Positional arguments are interpreted as names of builtins (e.g. ``len``,
      ``print``) to freeze as constants in the decorated function's namespace.

    - Keyword arguments provide additional custom names to freeze as constants.

    - If invoked with no positional or keyword arguments, ``asconstants``
      inlines all names in ``builtins``.

    Parameters
    ----------
    \*builtin_names
        Names of builtins to freeze as constants.
    \*\*kwargs
        Additional key-value pairs to bind as constants.

    Examples
    --------
    Freezing Builtins:

    >>> from codetransformer.transformers import asconstants
    >>>
    >>> @asconstants('len')
    ... def with_asconstants(x):
    ...     return len(x) * 2
    ...
    >>> def without_asconstants(x):
    ...     return len(x) * 2
    ...
    >>> len = lambda x: 0
    >>> with_asconstants([1, 2, 3])
    6
    >>> without_asconstants([1, 2, 3])
    0

    Adding Custom Constants:

    >>> @asconstants(a=1)
    ... def f():
    ...     return a
    ...
    >>> f()
    1
    >>> a = 5
    >>> f()
    1
    """
    def __init__(self, *builtin_names, **kwargs):
        super().__init__()
        bltins = vars(builtins)
        if not (builtin_names or kwargs):
            self._constnames = bltins.copy()
        else:
            self._constnames = constnames = {}
            for arg in builtin_names:
                constnames[arg] = bltins[arg]
            overlap = constnames.keys() & kwargs.keys()
            if overlap:
                raise TypeError('Duplicate keys: {!r}'.format(overlap))
            constnames.update(kwargs)

    def transform(self, code, **kwargs):
        overlap = self._constnames.keys() & set(code.argnames)
        if overlap:
            raise SyntaxError(
                'argument names overlap with constant names: %r' % overlap,
            )
        return super().transform(code, **kwargs)

    @pattern(LOAD_NAME | LOAD_GLOBAL | LOAD_DEREF | LOAD_CLASSDEREF)
    def _load_name(self, instr):
        name = instr.arg
        if name not in self._constnames:
            yield instr
            return

        yield LOAD_CONST(self._constnames[name]).steal(instr)

    _store = pattern(
        STORE_NAME | STORE_GLOBAL | STORE_DEREF | STORE_FAST,
    )(_assign_or_del('assign to'))
    _delete = pattern(
        DELETE_NAME | DELETE_GLOBAL | DELETE_DEREF | DELETE_FAST,
    )(_assign_or_del('delete'))


================================================
FILE: codetransformer/transformers/interpolated_strings.py
================================================
"""
A transformer implementing ruby-style interpolated strings.
"""
import sys

from codetransformer import pattern, CodeTransformer
from codetransformer.instructions import (
    BUILD_TUPLE,
    LOAD_CONST,
    LOAD_ATTR,
    CALL_FUNCTION,
    CALL_FUNCTION_KW,
    ROT_TWO,
)
from codetransformer.utils.functional import flatten, is_a


class interpolated_strings(CodeTransformer):
    """
    A transformer that interpolates local variables into string literals.

    Parameters
    ----------
    transform_bytes : bool, optional
        Whether to transform bytes literals to interpolated unicode strings.
        Default is True.
    transform_str : bool, optional
        Whether to interpolate values into unicode strings.
        Default is False.

    Example
    -------
    >>> @interpolated_strings()  # doctest: +SKIP
    ... def foo(a, b):
    ...     c = a + b
    ...     return b"{a} + {b} = {c}"
    ...
    >>> foo(1, 2)  # doctest: +SKIP
    '1 + 2 = 3'
    """

    if sys.version_info >= (3, 6):
        def __init__(self, *, transform_bytes=True, transform_str=False):
            raise NotImplementedError(
                '%s is not supported on 3.6 or newer, just use f-strings' %
                type(self).__name__,
            )
    else:
        def __init__(self, *, transform_bytes=True, transform_str=False):
            super().__init__()
            self._transform_bytes = transform_bytes
            self._transform_str = transform_str

    @property
    def types(self):
        """
        Tuple containing types transformed by this transformer.
        """
        out = []
        if self._transform_bytes:
            out.append(bytes)
        if self._transform_str:
            out.append(str)
        return tuple(out)

    @pattern(LOAD_CONST)
    def _load_const(self, instr):
        const = instr.arg

        if isinstance(const, (tuple, frozenset)):
            yield from self._transform_constant_sequence(const)
            return

        if isinstance(const, bytes) and self._transform_bytes:
            yield from self.transform_stringlike(const)
        elif isinstance(const, str) and self._transform_str:
            yield from self.transform_stringlike(const)
        else:
            yield instr

    def _transform_constant_sequence(self, seq):
        """
        Transform a frozenset or tuple.
        """
        should_transform = is_a(self.types)

        if not any(filter(should_transform, flatten(seq))):
            # Tuple doesn't contain any transformable strings. Ignore.
            yield LOAD_CONST(seq)
            return

        for const in seq:
            if should_transform(const):
                yield from self.transform_stringlike(const)
            elif isinstance(const, (tuple, frozenset)):
                yield from self._transform_constant_sequence(const)
            else:
                yield LOAD_CONST(const)

        if isinstance(seq, tuple):
            yield BUILD_TUPLE(len(seq))
        else:
            assert isinstance(seq, frozenset)
            yield BUILD_TUPLE(len(seq))
            yield LOAD_CONST(frozenset)
            yield ROT_TWO()
            yield CALL_FUNCTION(1)

    def transform_stringlike(self, const):
        """
        Yield instructions to process a str or bytes constant.
        """
        yield LOAD_CONST(const)
        if isinstance(const, bytes):
            yield from self.bytes_instrs
        elif isinstance(const, str):
            yield from self.str_instrs

    @property
    def bytes_instrs(self):
        """
        Yield instructions to call TOS.decode('utf-8').format(**locals()).
        """
        yield LOAD_ATTR('decode')
        yield LOAD_CONST('utf-8')
        yield CALL_FUNCTION(1)
        yield from self.str_instrs

    @property
    def str_instrs(self):
        """
        Yield instructions to call TOS.format(**locals()).
        """
        yield LOAD_ATTR('format')
        yield LOAD_CONST(locals)
        yield CALL_FUNCTION(0)
        yield CALL_FUNCTION_KW()


================================================
FILE: codetransformer/transformers/literals.py
================================================
from collections import OrderedDict
from decimal import Decimal
from itertools import islice
import sys
from textwrap import dedent

from .. import instructions
from ..core import CodeTransformer
from ..patterns import pattern,  matchany, var
from ..utils.instance import instance


IN_COMPREHENSION = 'in_comprehension'


class overloaded_dicts(CodeTransformer):
    """Transformer that allows us to overload dictionary literals.

    This acts by creating an empty map and then inserting every
    key value pair in order.

    The code that is generated will turn something like::

        {k_0: v_0, k_1: v_1, ..., k_n: v_n}

    into::

        _tmp = astype()
        _tmp[k_0] = v_0
        _tmp[k_1] = v_1
        ...
        _tmp[k_n] = v_n
        _tmp  # leaves the map on the stack.

    Parameters
    ----------
    astype : callable
        The constructor for the type to create.

    Examples
    --------
    >>> from collections import OrderedDict
    >>> ordereddict_literals = overloaded_dicts(OrderedDict)
    >>> @ordereddict_literals
    ... def f():
    ...     return {'a': 1, 'b': 2, 'c': 3}
    ...
    >>> f()
    OrderedDict([('a', 1), ('b', 2), ('c', 3)])
    """
    def __init__(self, astype):
        super().__init__()
        self.astype = astype

    @pattern(instructions.BUILD_MAP, matchany[var], instructions.MAP_ADD)
    def _start_comprehension(self, instr, *instrs):
        yield instructions.LOAD_CONST(self.astype).steal(instr)
        # TOS  = self.astype

        yield instructions.CALL_FUNCTION(0)
        # TOS  = m = self.astype()

        yield instructions.STORE_FAST('__map__')

        *body, map_add = instrs
        yield from self.patterndispatcher(body)
        # TOS  = k
        # TOS1 = v

        yield instructions.LOAD_FAST('__map__').steal(map_add)
        # TOS  = __map__
        # TOS1 = k
        # TOS2 = v

        yield instructions.ROT_TWO()
        # TOS  = k
        # TOS1 = __map__
        # TOS2 = v

        yield instructions.STORE_SUBSCR()
        self.begin(IN_COMPREHENSION)

    @pattern(instructions.RETURN_VALUE, startcodes=(IN_COMPREHENSION,))
    def _return_value(self, instr):
        yield instructions.LOAD_FAST('__map__').steal(instr)
        # TOS  = __map__

        yield instr

    if sys.version_info[:2] <= (3, 4):
        # Python 3.4

        @pattern(instructions.BUILD_MAP)
        def _build_map(self, instr):
            yield instructions.LOAD_CONST(self.astype).steal(instr)
            # TOS  = self.astype

            yield instructions.CALL_FUNCTION(0)
            # TOS  = m = self.astype()

            yield from (instructions.DUP_TOP(),) * instr.arg
            # TOS  = m
            # ...
            # TOS[instr.arg] = m

        @pattern(instructions.STORE_MAP)
        def _store_map(self, instr):
            # TOS  = k
            # TOS1 = v
            # TOS2 = m
            # TOS3 = m

            yield instructions.ROT_THREE().steal(instr)
            # TOS  = v
            # TOS1 = m
            # TOS2 = k
            # TOS3 = m

            yield instructions.ROT_THREE()
            # TOS  = m
            # TOS1 = k
            # TOS2 = v
            # TOS3 = m

            yield instructions.ROT_TWO()
            # TOS  = k
            # TOS1 = m
            # TOS2 = v
            # TOS3 = m

            yield instructions.STORE_SUBSCR()
            # TOS  = m

    else:
        # Python 3.5 and beyond!

        def _construct_map(self, key_value_pairs):
            mapping = self.astype()
            for key, value in zip(key_value_pairs[::2], key_value_pairs[1::2]):
                mapping[key] = value
            return mapping

        @pattern(instructions.BUILD_MAP)
        def _build_map(self, instr):
            # TOS      = vn
            # TOS1     = kn
            # ...
            # TOSN     = v0
            # TOSN + 1 = k0
            # Construct a tuple of (k0, v0, k1, v1, ..., kn, vn) for
            # each of the key: value pairs in the dictionary.
            yield instructions.BUILD_TUPLE(instr.arg * 2).steal(instr)
            # TOS  = (k0, v0, k1, v1, ..., kn, vn)

            yield instructions.LOAD_CONST(self._construct_map)
            # TOS  = self._construct_map
            # TOS1 = (k0, v0, k1, v1, ..., kn, vn)

            yield instructions.ROT_TWO()
            # TOS  = (k0, v0, k1, v1, ..., kn, vn)
            # TOS1 = self._construct_
Download .txt
gitextract_cgcyv2vr/

├── .coveragerc
├── .gitattributes
├── .gitignore
├── .travis.yml
├── LICENSE
├── MANIFEST.in
├── README.rst
├── codetransformer/
│   ├── __init__.py
│   ├── _version.py
│   ├── code.py
│   ├── core.py
│   ├── decompiler/
│   │   ├── _343.py
│   │   └── __init__.py
│   ├── instructions.py
│   ├── patterns.py
│   ├── tests/
│   │   ├── __init__.py
│   │   ├── test_code.py
│   │   ├── test_core.py
│   │   ├── test_decompiler.py
│   │   └── test_instructions.py
│   ├── transformers/
│   │   ├── __init__.py
│   │   ├── add2mul.py
│   │   ├── constants.py
│   │   ├── interpolated_strings.py
│   │   ├── literals.py
│   │   ├── pattern_matched_exceptions.py
│   │   ├── precomputed_slices.py
│   │   └── tests/
│   │       ├── __init__.py
│   │       ├── test_add2mul.py
│   │       ├── test_constants.py
│   │       ├── test_exc_patterns.py
│   │       ├── test_interpolated_strings.py
│   │       ├── test_literals.py
│   │       └── test_precomputed_slices.py
│   └── utils/
│       ├── __init__.py
│       ├── functional.py
│       ├── immutable.py
│       ├── instance.py
│       ├── no_default.py
│       ├── pretty.py
│       └── tests/
│           ├── __init__.py
│           ├── test_immutable.py
│           └── test_pretty.py
├── docs/
│   ├── .dir-locals.el
│   ├── Makefile
│   └── source/
│       ├── appendix.rst
│       ├── code-objects.rst
│       ├── conf.py
│       ├── index.rst
│       ├── magics.rst
│       └── patterns.rst
├── requirements_doc.txt
├── setup.cfg
├── setup.py
├── tox.ini
└── versioneer.py
Download .txt
SYMBOL INDEX (524 symbols across 32 files)

FILE: codetransformer/__init__.py
  function load_ipython_extension (line 23) | def load_ipython_extension(ipython):  # pragma: no cover

FILE: codetransformer/_version.py
  function get_keywords (line 18) | def get_keywords():
  class VersioneerConfig (line 29) | class VersioneerConfig:
  function get_config (line 33) | def get_config():
  class NotThisMethod (line 46) | class NotThisMethod(Exception):
  function register_vcs_handler (line 54) | def register_vcs_handler(vcs, method):  # decorator
  function run_command (line 63) | def run_command(commands, args, cwd=None, verbose=False, hide_stderr=Fal...
  function versions_from_parentdir (line 96) | def versions_from_parentdir(parentdir_prefix, root, verbose):
  function git_get_keywords (line 111) | def git_get_keywords(versionfile_abs):
  function git_versions_from_keywords (line 135) | def git_versions_from_keywords(keywords, tag_prefix, verbose):
  function git_pieces_from_vcs (line 180) | def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_comma...
  function plus_or_dot (line 261) | def plus_or_dot(pieces):
  function render_pep440 (line 267) | def render_pep440(pieces):
  function render_pep440_pre (line 291) | def render_pep440_pre(pieces):
  function render_pep440_post (line 307) | def render_pep440_post(pieces):
  function render_pep440_old (line 333) | def render_pep440_old(pieces):
  function render_git_describe (line 353) | def render_git_describe(pieces):
  function render_git_describe_long (line 372) | def render_git_describe_long(pieces):
  function render (line 390) | def render(pieces, style):
  function get_versions (line 419) | def get_versions():

FILE: codetransformer/code.py
  function _sparse_args (line 27) | def _sparse_args(instrs):
  function _sparse_args (line 36) | def _sparse_args(instrs):
  class Flag (line 65) | class Flag(IntEnum):
    class max (line 100) | class max:
      method __get__ (line 103) | def __get__(self, instance, owner):
      method __set__ (line 106) | def __set__(self, instance, value):
    method pack (line 110) | def pack(cls,
    method unpack (line 167) | def unpack(cls, mask):
  function _freevar_argname (line 193) | def _freevar_argname(arg, cellvars, freevars):
  function pycode (line 220) | def pycode(argcount,
  class Code (line 260) | class Code:
    method __init__ (line 321) | def __init__(self,
    method from_pyfunc (line 413) | def from_pyfunc(cls, f):
    method from_pycode (line 429) | def from_pycode(cls, co):
    method to_pycode (line 512) | def to_pycode(self):
    method instrs (line 603) | def instrs(self):
    method sparse_instrs (line 609) | def sparse_instrs(self):
    method argcount (line 618) | def argcount(self):
    method kwonlyargcount (line 626) | def kwonlyargcount(self):
    method consts (line 634) | def consts(self):
    method names (line 647) | def names(self):
    method argnames (line 660) | def argnames(self):
    method varnames (line 669) | def varnames(self):
    method cellvars (line 681) | def cellvars(self):
    method freevars (line 687) | def freevars(self):
    method flags (line 693) | def flags(self):
    method py_flags (line 705) | def py_flags(self):
    method is_nested (line 711) | def is_nested(self):
    method is_generator (line 717) | def is_generator(self):
    method is_coroutine (line 723) | def is_coroutine(self):
    method is_iterable_coroutine (line 731) | def is_iterable_coroutine(self):
    method constructs_new_locals (line 739) | def constructs_new_locals(self):
    method filename (line 749) | def filename(self):
    method name (line 755) | def name(self):
    method firstlineno (line 761) | def firstlineno(self):
    method lnotab (line 768) | def lnotab(self):
    method lno_of_instr (line 774) | def lno_of_instr(self):
    method py_lnotab (line 783) | def py_lnotab(self):
    method stacksize (line 814) | def stacksize(self):
    method index (line 823) | def index(self, instr):
    method bytecode_offset (line 838) | def bytecode_offset(self, instr):
    method __getitem__ (line 853) | def __getitem__(self, key):
    method __iter__ (line 856) | def __iter__(self):
    method __len__ (line 859) | def __len__(self):
    method __contains__ (line 862) | def __contains__(self, instr):
    method dis (line 865) | def dis(self, file=None):

FILE: codetransformer/core.py
  function _a_if_not_none (line 28) | def _a_if_not_none(a, b):
  function _new_lnotab (line 32) | def _new_lnotab(instrs, lnotab):
  class NoContext (line 53) | class NoContext(Exception):
    method __init__ (line 57) | def __init__(self):
  class Context (line 61) | class Context:
    method __init__ (line 64) | def __init__(self, code):
    method __repr__ (line 68) | def __repr__(self):  # pragma: no cover
  class CodeTransformerMeta (line 72) | class CodeTransformerMeta(type):
    method __new__ (line 79) | def __new__(mcls, name, bases, dict_):
    method __prepare__ (line 90) | def __prepare__(self, bases):
  class CodeTransformer (line 94) | class CodeTransformer(metaclass=CodeTransformerMeta):
    method transform_consts (line 104) | def transform_consts(self, consts):
    method _id (line 126) | def _id(self, obj):
    method transform (line 150) | def transform(self, code, *, name=None, filename=None):
    method __call__ (line 206) | def __call__(self, f, *,
    class _context_stack (line 223) | class _context_stack(threading.local):
      method __get__ (line 237) | def __get__(self, instance, owner):
    method _new_context (line 251) | def _new_context(self, code):
    method context (line 259) | def context(self):
    method code (line 273) | def code(self):
    method startcode (line 279) | def startcode(self):
    method begin (line 284) | def begin(self, startcode):

FILE: codetransformer/decompiler/_343.py
  class DecompilationError (line 26) | class DecompilationError(Exception):
  class DecompilationContext (line 30) | class DecompilationContext(immutable,
  class MakeFunctionContext (line 48) | class MakeFunctionContext(immutable):
  function decompile (line 52) | def decompile(f):
  function pycode_to_body (line 100) | def pycode_to_body(co, context):
  function instrs_to_body (line 124) | def instrs_to_body(instrs, context):
  function process_instrs (line 139) | def process_instrs(queue, stack, body, context):
  function _process_instr (line 151) | def _process_instr(instr, queue, stack, body, context):
  function _instr (line 158) | def _instr(instr, queue, stack, body, context):
  function _process_jump (line 166) | def _process_jump(instr, queue, stack, body, context):
  function make_if_statement (line 182) | def make_if_statement(instr, queue, stack, context):
  function _process_instr_extended_arg (line 215) | def _process_instr_extended_arg(instr, queue, stack, body, context):
  function _process_instr_unpack_sequence (line 221) | def _process_instr_unpack_sequence(instr, queue, stack, body, context):
  function _process_instr_import_name (line 226) | def _process_instr_import_name(instr, queue, stack, body, context):
  function _pop_import_LOAD_ATTRs (line 286) | def _pop_import_LOAD_ATTRs(module_name, queue):
  function make_importfrom_alias (line 316) | def make_importfrom_alias(queue, body, context, name):
  function _push (line 375) | def _push(instr, queue, stack, body, context):
  function _make_function (line 385) | def _make_function(instr, queue, stack, body, context):
  function _store (line 409) | def _store(instr, queue, stack, body, context):
  function _dup_top (line 425) | def _dup_top(instr, queue, stack, body, context):
  function make_assignment (line 429) | def make_assignment(instr, queue, stack):
  function make_assign_target (line 451) | def make_assign_target(instr, queue, stack):
  function make_assign_target_store (line 462) | def make_assign_target_store(instr, queue, stack):
  function make_assign_target_setattr (line 467) | def make_assign_target_setattr(instr, queue, stack):
  function make_assign_target_setitem (line 476) | def make_assign_target_setitem(instr, queue, stack):
  function make_assign_target_unpack (line 487) | def make_assign_target_unpack(instr, queue, stack):
  function make_assign_target_load_name (line 500) | def make_assign_target_load_name(instr, queue, stack):
  function _store_subscr (line 510) | def _store_subscr(instr, queue, stack, body, context):
  function _pop (line 517) | def _pop(instr, queue, stack, body, context):
  function _return (line 522) | def _return(instr, queue, stack, body, context):
  function _jump_break_loop (line 539) | def _jump_break_loop(instr, queue, stack, body, context):
  function _jump_absolute (line 546) | def _jump_absolute(instr, queue, stack, body, context):
  function _process_instr_setup_with (line 554) | def _process_instr_setup_with(instr, queue, stack, body, context):
  function pop_with_body_instrs (line 574) | def pop_with_body_instrs(setup_with_instr, queue):
  function make_withitem (line 602) | def make_withitem(queue, stack):
  function _loop (line 627) | def _loop(instr, queue, stack, body, context):
  function make_for_loop (line 636) | def make_for_loop(loop_body_instrs, else_body_instrs, context):
  function make_loop_body_and_orelse (line 672) | def make_loop_body_and_orelse(top_of_loop, body_instrs, else_instrs, con...
  function make_while_loop (line 713) | def make_while_loop(test_and_body_instrs, else_body_instrs, context):
  function make_while_loop_test_expr (line 739) | def make_while_loop_test_expr(loop_body_instrs):
  function pop_loop_instrs (line 784) | def pop_loop_instrs(setup_loop_instr, queue):
  function make_expr (line 842) | def make_expr(stack_builders):
  function _make_expr (line 856) | def _make_expr(toplevel, stack_builders):
  function make_boolop (line 884) | def make_boolop(exprs, op_types):
  function normalize_boolop (line 901) | def normalize_boolop(expr):
  function _make_expr_internal (line 920) | def _make_expr_internal(toplevel, stack_builders):
  function _make_lambda (line 928) | def _make_lambda(toplevel, stack_builders):
  function _make_expr_unary_not (line 977) | def _make_expr_unary_not(toplevel, stack_builders):
  function _make_expr_call_function (line 985) | def _make_expr_call_function(toplevel, stack_builders):
  function _make_expr_call_function_var (line 998) | def _make_expr_call_function_var(toplevel, stack_builders):
  function _make_expr_call_function_kw (line 1012) | def _make_expr_call_function_kw(toplevel, stack_builders):
  function _make_expr_call_function_var_kw (line 1026) | def _make_expr_call_function_var_kw(toplevel, stack_builders):
  function make_call_keywords (line 1040) | def make_call_keywords(stack_builders, count):
  function make_call_positionals (line 1061) | def make_call_positionals(stack_builders, count):
  function _make_expr_tuple (line 1071) | def _make_expr_tuple(toplevel, stack_builders):
  function _make_expr_set (line 1079) | def _make_expr_set(toplevel, stack_builders):
  function _make_expr_list (line 1087) | def _make_expr_list(toplevel, stack_builders):
  function make_exprs (line 1094) | def make_exprs(stack_builders, count):
  function _make_expr_empty_dict (line 1106) | def _make_expr_empty_dict(toplevel, stack_builders):
  function _make_expr_dict (line 1124) | def _make_expr_dict(toplevel, stack_builders):
  function find_build_map (line 1143) | def find_build_map(stack_builders):
  function _make_dict_elems (line 1165) | def _make_dict_elems(build_instr, builders):
  function _make_expr_name (line 1194) | def _make_expr_name(toplevel, stack_builders):
  function _make_expr_attr (line 1199) | def _make_expr_attr(toplevel, stack_builders):
  function _make_expr_getitem (line 1208) | def _make_expr_getitem(toplevel, stack_builders):
  function make_slice (line 1214) | def make_slice(stack_builders):
  function _make_slice (line 1225) | def _make_slice(toplevel, stack_builders):
  function make_slice_build_slice (line 1230) | def make_slice_build_slice(toplevel, stack_builders):
  function make_slice_tuple (line 1235) | def make_slice_tuple(toplevel, stack_builders):
  function normalize_tuple_slice (line 1244) | def normalize_tuple_slice(node):
  function _make_expr_build_slice (line 1265) | def _make_expr_build_slice(toplevel, stack_builders):
  function _make_expr_const (line 1290) | def _make_expr_const(toplevel, stack_builders):
  function _make_const (line 1295) | def _make_const(const):
  function _make_const_number (line 1304) | def _make_const_number(const):
  function _make_const_str (line 1309) | def _make_const_str(const):
  function _make_const_bytes (line 1314) | def _make_const_bytes(const):
  function _make_const_tuple (line 1319) | def _make_const_tuple(const):
  function _make_const_none (line 1324) | def _make_const_none(none):
  function _binop_handler (line 1344) | def _binop_handler(nodetype):
  function make_function (line 1360) | def make_function(function_builders, *, closure):
  function make_function_arguments (line 1433) | def make_function_arguments(args,
  function make_closure_cells (line 1459) | def make_closure_cells(stack_builders):
  function make_global_and_nonlocal_decls (line 1469) | def make_global_and_nonlocal_decls(code_instrs):
  function make_defaults_and_annotations (line 1490) | def make_defaults_and_annotations(make_function_instr, builders):
  function unpack_make_function_arg (line 1530) | def unpack_make_function_arg(arg):
  function _check_make_function_instrs (line 1550) | def _check_make_function_instrs(load_code_instr,
  function pop_arguments (line 1604) | def pop_arguments(instr, stack):
  function _check_stack_for_module_return (line 1630) | def _check_stack_for_module_return(stack):
  function expect (line 1648) | def expect(instr, expected, context):
  function is_lambda_name (line 1661) | def is_lambda_name(name):
  function popwhile (line 1668) | def popwhile(cond, queue, *, side):
  function _current_test (line 1716) | def _current_test():

FILE: codetransformer/decompiler/__init__.py
  function paramnames (line 6) | def paramnames(co):

FILE: codetransformer/instructions.py
  function _notimplemented (line 47) | def _notimplemented(name):
  function _vartype (line 56) | def _vartype(self):
  class InstructionMeta (line 66) | class InstructionMeta(ABCMeta, matchable):
    method __init__ (line 70) | def __init__(self, *args, opcode=None):
    method __new__ (line 73) | def __new__(mcls, name, bases, dict_, *, opcode=None):
    method mcompile (line 118) | def mcompile(self):
    method __repr__ (line 121) | def __repr__(self):
  class Instruction (line 126) | class Instruction(InstructionMeta._marker, metaclass=InstructionMeta):
    method __init__ (line 141) | def __init__(self, arg=_no_arg):
    method __repr__ (line 150) | def __repr__(self):
    method _normalize_arg (line 158) | def _normalize_arg(arg):
    method steal (line 161) | def steal(self, instr):
    method from_opcode (line 189) | def from_opcode(cls, opcode, arg=_no_arg):
    method stack_effect (line 208) | def stack_effect(self):
    method equiv (line 238) | def equiv(self, instr):
  class _RawArg (line 262) | class _RawArg(int):
  function _mk_call_init (line 270) | def _mk_call_init(class_):
  function _call_repr (line 296) | def _call_repr(self):
  function _check_jmp_arg (line 304) | def _check_jmp_arg(self, arg):
  class CompareOpMeta (line 316) | class CompareOpMeta(InstructionMeta):
    class comparator (line 332) | class comparator(IntEnum):
      method __repr__ (line 345) | def __repr__(self):
    class ComparatorDescr (line 350) | class ComparatorDescr:
      method __init__ (line 361) | def __init__(self, op):
      method __get__ (line 364) | def __get__(self, instance, owner):

FILE: codetransformer/patterns.py
  function _prepr (line 14) | def _prepr(m):
  function coerce_ellipsis (line 21) | def coerce_ellipsis(p):
  class matchable (line 30) | class matchable:
    method __or__ (line 33) | def __or__(self, other):
    method __ror__ (line 53) | def __ror__(self, other):
    method __invert__ (line 60) | def __invert__(self):
    method __getitem__ (line 63) | def __getitem__(self, key):
  class postfix_modifier (line 80) | class postfix_modifier(immutable, matchable):
    method mcompile (line 85) | def mcompile(self):
    method __repr__ (line 88) | def __repr__(self):
  class meta (line 93) | class meta(matchable):
    method mcompile (line 96) | def mcompile(self):
    method __repr__ (line 99) | def __repr__(self):
  class modifier (line 104) | class modifier(meta):
  class var (line 111) | class var(modifier):
  class plus (line 118) | class plus(modifier):
  class option (line 125) | class option(modifier):
  class matchrange (line 131) | class matchrange(immutable, meta, defaults={'m': None}):
    method mcompile (line 134) | def mcompile(self):
    method __repr__ (line 144) | def __repr__(self):
  class matchany (line 152) | class matchany(meta):
    method __repr__ (line 157) | def __repr__(self):
  class seq (line 161) | class seq(immutable, matchable):
    method __new__ (line 171) | def __new__(cls, *matchables):
    method __init__ (line 179) | def __init__(self, *matchables):
    method mcompile (line 182) | def mcompile(self):
    method __repr__ (line 185) | def __repr__(self):
  class or_ (line 192) | class or_(immutable, matchable):
    method mcompile (line 202) | def mcompile(self):
    method __repr__ (line 205) | def __repr__(self):
  class not_ (line 209) | class not_(immutable, matchable):
    method mcompile (line 214) | def mcompile(self):
    method __repr__ (line 221) | def __repr__(self):
  class pattern (line 225) | class pattern(immutable):
    method __init__ (line 261) | def __init__(self, *matchables, startcodes=(DEFAULT_STARTCODE,)):
    method __call__ (line 268) | def __call__(self, f):
    method __repr__ (line 271) | def __repr__(self):
  class boundpattern (line 279) | class boundpattern(immutable):
    method __get__ (line 284) | def __get__(self, instance, owner):
    method __call__ (line 294) | def __call__(self, compiled_instrs, instrs, startcode):
  class NoMatch (line 306) | class NoMatch(Exception):
  class patterndispatcher (line 312) | class patterndispatcher(immutable):
    method __get__ (line 317) | def __get__(self, instance, owner):
  class boundpatterndispatcher (line 330) | class boundpatterndispatcher(immutable):
    method _dispatch (line 335) | def _dispatch(self, compiled_instrs, instrs, startcode):
    method __call__ (line 344) | def __call__(self, instrs):

FILE: codetransformer/tests/test_code.py
  function sample_flags (line 14) | def sample_flags(request):
  function test_lnotab_roundtrip (line 26) | def test_lnotab_roundtrip():
  function test_lnotab_really_dumb_whitespace (line 53) | def test_lnotab_really_dumb_whitespace():
  function test_flag_packing (line 68) | def test_flag_packing(sample_flags):
  function test_flag_unpack_too_big (line 73) | def test_flag_unpack_too_big():
  function test_flag_max (line 79) | def test_flag_max():
  function test_flag_max_immutable (line 100) | def test_flag_max_immutable():
  function test_code_multiple_varargs (line 105) | def test_code_multiple_varargs():
  function test_code_multiple_kwargs (line 117) | def test_code_multiple_kwargs():
  function test_dangling_var (line 130) | def test_dangling_var(cls):
  function test_code_flags (line 141) | def test_code_flags(sample_flags):
  function abc_code (line 177) | def abc_code():
  function test_instr_index (line 186) | def test_instr_index(abc_code):
  function test_code_contains (line 196) | def test_code_contains(abc_code):
  function test_code_dis (line 204) | def test_code_dis(capsys):

FILE: codetransformer/tests/test_core.py
  function test_inherit_patterns (line 11) | def test_inherit_patterns():
  function test_override_patterns (line 33) | def test_override_patterns():
  function test_updates_lnotab (line 61) | def test_updates_lnotab():
  function test_context (line 106) | def test_context():
  function test_no_context (line 122) | def test_no_context():

FILE: codetransformer/tests/test_decompiler.py
  function make_indented_body (line 31) | def make_indented_body(body_str):
  function compare (line 43) | def compare(computed, expected):
  function check (line 64) | def check(text, ast_text=None):
  function check_formatted (line 87) | def check_formatted(text, ast_text=None, **fmt_kwargs):
  function test_decompile (line 129) | def test_decompile():
  function test_trivial_expr (line 146) | def test_trivial_expr():
  function test_assign (line 153) | def test_assign(lhs, rhs):
  function test_unpack_to_attribute (line 157) | def test_unpack_to_attribute():
  function test_chained_assign (line 163) | def test_chained_assign():
  function test_unary_not (line 169) | def test_unary_not():
  function test_binary_ops (line 191) | def test_binary_ops(op):
  function test_string_literal (line 198) | def test_string_literal():
  function test_bytes_literal (line 209) | def test_bytes_literal():
  function test_int_literal (line 215) | def test_int_literal():
  function test_float_literal (line 222) | def test_float_literal():
  function test_complex_literal (line 229) | def test_complex_literal():
  function test_tuple_literals (line 236) | def test_tuple_literals():
  function test_set_literals (line 246) | def test_set_literals():
  function test_list_literals (line 252) | def test_list_literals():
  function test_dict_literals (line 259) | def test_dict_literals():
  function test_function_call (line 273) | def test_function_call():
  function test_paramnames (line 290) | def test_paramnames():
  function test_lambda (line 358) | def test_lambda(signature, expr):
  function test_simple_function (line 373) | def test_simple_function():
  function test_annotations (line 384) | def test_annotations():
  function test_function_signatures (line 457) | def test_function_signatures(signature, body):
  function test_decorators (line 468) | def test_decorators():
  function test_store_twice_to_global (line 486) | def test_store_twice_to_global():
  function test_store_twice_to_nonlocal (line 501) | def test_store_twice_to_nonlocal():
  function test_getattr (line 518) | def test_getattr():
  function test_setattr (line 531) | def test_setattr():
  function test_getitem (line 538) | def test_getitem():
  function test_setitem (line 553) | def test_setitem():
  function test_for (line 580) | def test_for(loop, body, else_body):
  function test_while (line 614) | def test_while(condition, body, else_body):
  function test_while_False (line 633) | def test_while_False():
  function test_import (line 647) | def test_import():
  function test_import_from (line 673) | def test_import_from():
  function test_import_star (line 679) | def test_import_star():
  function test_import_attribute_aliasing_module (line 684) | def test_import_attribute_aliasing_module():
  function test_import_in_function (line 688) | def test_import_in_function():
  function test_with_block (line 727) | def test_with_block():
  function test_nested_with (line 786) | def test_nested_with():
  function test_simple_if (line 815) | def test_simple_if():
  function test_if_return (line 827) | def test_if_return():
  function test_if_else (line 840) | def test_if_else():
  function test_if_elif (line 861) | def test_if_elif(last_statement, prefix):
  function test_boolops (line 903) | def test_boolops(op):
  function test_normalize_nested_boolops (line 919) | def test_normalize_nested_boolops(op):
  function test_mixed_boolops (line 934) | def test_mixed_boolops():

FILE: codetransformer/tests/test_instructions.py
  function test_repr_types (line 4) | def test_repr_types():

FILE: codetransformer/transformers/add2mul.py
  class add2mul (line 14) | class add2mul(CodeTransformer):
    method _add2mul (line 16) | def _add2mul(self, add_instr):

FILE: codetransformer/transformers/constants.py
  function _assign_or_del (line 22) | def _assign_or_del(type_):
  class asconstants (line 48) | class asconstants(CodeTransformer):
    method __init__ (line 98) | def __init__(self, *builtin_names, **kwargs):
    method transform (line 112) | def transform(self, code, **kwargs):
    method _load_name (line 121) | def _load_name(self, instr):

FILE: codetransformer/transformers/interpolated_strings.py
  class interpolated_strings (line 18) | class interpolated_strings(CodeTransformer):
    method __init__ (line 43) | def __init__(self, *, transform_bytes=True, transform_str=False):
    method __init__ (line 49) | def __init__(self, *, transform_bytes=True, transform_str=False):
    method types (line 55) | def types(self):
    method _load_const (line 67) | def _load_const(self, instr):
    method _transform_constant_sequence (line 81) | def _transform_constant_sequence(self, seq):
    method transform_stringlike (line 109) | def transform_stringlike(self, const):
    method bytes_instrs (line 120) | def bytes_instrs(self):
    method str_instrs (line 130) | def str_instrs(self):

FILE: codetransformer/transformers/literals.py
  class overloaded_dicts (line 16) | class overloaded_dicts(CodeTransformer):
    method __init__ (line 51) | def __init__(self, astype):
    method _start_comprehension (line 56) | def _start_comprehension(self, instr, *instrs):
    method _return_value (line 84) | def _return_value(self, instr):
    method _build_map (line 94) | def _build_map(self, instr):
    method _store_map (line 107) | def _store_map(self, instr):
    method _construct_map (line 137) | def _construct_map(self, key_value_pairs):
    method _build_map (line 144) | def _build_map(self, instr):
    method _construct_const_map (line 166) | def _construct_const_map(self, values, keys):
    method _build_const_map (line 173) | def _build_const_map(self, keys, instr):
  function _format_constant_docstring (line 197) | def _format_constant_docstring(type_):
  class _ConstantTransformerBase (line 215) | class _ConstantTransformerBase(CodeTransformer):
    method __init__ (line 217) | def __init__(self, xform):
    method transform_consts (line 221) | def transform_consts(self, consts):
  function overloaded_constants (line 237) | def overloaded_constants(type_, __doc__=None):
  function _start_comprehension (line 311) | def _start_comprehension(self, *instrs):
  function _return_value (line 316) | def _return_value(self, instr):
  function _build (line 334) | def _build(self, instr):
  function overloaded_build (line 350) | def overloaded_build(type_, add_name=None):
  function transform_consts (line 420) | def transform_consts(self, consts):
  function transform_consts (line 438) | def transform_consts(self, consts):
  class islice_literals (line 451) | class islice_literals(CodeTransformer):
    method _binary_subscr (line 467) | def _binary_subscr(self, instr):
    method _islicer (line 482) | def _islicer(m, k):

FILE: codetransformer/transformers/pattern_matched_exceptions.py
  function match (line 15) | def match(match_expr, exc_type, exc_value, exc_traceback):
  class pattern_matched_exceptions (line 34) | class pattern_matched_exceptions(CodeTransformer):
    method __init__ (line 69) | def __init__(self, matcher=match):
    method _match (line 76) | def _match(self,
    method _match (line 94) | def _match(self,
    method _compare_op (line 112) | def _compare_op(self, instr):

FILE: codetransformer/transformers/precomputed_slices.py
  class precomputed_slices (line 6) | class precomputed_slices(CodeTransformer):
    method make_constant_slice (line 30) | def make_constant_slice(self, *instrs):

FILE: codetransformer/transformers/tests/test_add2mul.py
  function test_add2mul (line 4) | def test_add2mul():

FILE: codetransformer/transformers/tests/test_constants.py
  function test_global (line 14) | def test_global():
  function test_name (line 23) | def test_name():
  function test_closure (line 39) | def test_closure():
  function test_store (line 52) | def test_store():
  function test_delete (line 65) | def test_delete():
  function test_argname_overlap (line 78) | def test_argname_overlap():

FILE: codetransformer/transformers/tests/test_exc_patterns.py
  function test_patterns (line 5) | def test_patterns():
  function test_patterns_bind_name (line 23) | def test_patterns_bind_name():
  function test_patterns_reraise (line 39) | def test_patterns_reraise():
  function test_normal_exc_match (line 55) | def test_normal_exc_match():
  function test_exc_match_custom_func (line 69) | def test_exc_match_custom_func():

FILE: codetransformer/transformers/tests/test_interpolated_strings.py
  function test_interpolated_bytes (line 14) | def test_interpolated_bytes():
  function test_interpolated_str (line 35) | def test_interpolated_str():
  function test_no_cross_pollination (line 56) | def test_no_cross_pollination():
  function test_string_in_nested_const (line 75) | def test_string_in_nested_const():

FILE: codetransformer/transformers/tests/test_literals.py
  function test_overload_thing_with_thing_is_noop (line 21) | def test_overload_thing_with_thing_is_noop():
  function test_overloaded_dicts (line 30) | def test_overloaded_dicts():
  function test_overloaded_bytes (line 45) | def test_overloaded_bytes():
  function test_overloaded_floats (line 73) | def test_overloaded_floats():
  function test_overloaded_lists (line 98) | def test_overloaded_lists():
  function test_overloaded_strs (line 126) | def test_overloaded_strs():
  function test_overloaded_sets (line 141) | def test_overloaded_sets():
  function test_overloaded_tuples (line 168) | def test_overloaded_tuples():
  function test_overloaded_slices (line 186) | def test_overloaded_slices():
  function test_islice_literals (line 208) | def test_islice_literals():

FILE: codetransformer/transformers/tests/test_precomputed_slices.py
  function test_precomputed_slices (line 7) | def test_precomputed_slices():
  function test_precomputed_slices_non_const (line 22) | def test_precomputed_slices_non_const():

FILE: codetransformer/utils/functional.py
  function is_a (line 11) | def is_a(type_):
  function not_a (line 16) | def not_a(type_):
  function scanl (line 21) | def scanl(f, n, ns):
  function reverse_dict (line 52) | def reverse_dict(d):
  function ffill (line 75) | def ffill(iterable):
  function flatten (line 99) | def flatten(seq, *, recurse_types=(tuple, list, set, frozenset)):

FILE: codetransformer/utils/immutable.py
  class immutableattr (line 15) | class immutableattr:
    method __init__ (line 23) | def __init__(self, attr):
    method __get__ (line 26) | def __get__(self, instance, owner):
  class lazyval (line 30) | class lazyval:
    method __init__ (line 38) | def __init__(self, func):
    method __get__ (line 42) | def __get__(self, instance, owner):
  function _no_arg_init (line 54) | def _no_arg_init(self):
  function initialize_slot (line 61) | def initialize_slot(obj, name, value):
  function _create_init (line 79) | def _create_init(name, slots, defaults):
  function _wrapinit (line 177) | def _wrapinit(init):
  function _check_missing_slots (line 264) | def _check_missing_slots(ob):
  function __setattr__ (line 290) | def __setattr__(self, name, value):
  function __repr__ (line 299) | def __repr__(self):
  class ImmutableMeta (line 309) | class ImmutableMeta(type):
    method __new__ (line 312) | def __new__(mcls, name, bases, dict_, *, defaults=None):
    method __init__ (line 337) | def __init__(self, *args, defaults=None):
  class immutable (line 342) | class immutable(metaclass=ImmutableMeta):
    method to_dict (line 347) | def to_dict(self):
    method update (line 350) | def update(self, **updates):

FILE: codetransformer/utils/instance.py
  function instance (line 1) | def instance(cls):

FILE: codetransformer/utils/no_default.py
  class no_default (line 2) | class no_default:
    method __new__ (line 3) | def __new__(cls):
    method __repr__ (line 6) | def __repr__(self):
    method __reduce__ (line 10) | def __reduce__(self):
    method __deepcopy__ (line 13) | def __deepcopy__(self):

FILE: codetransformer/utils/pretty.py
  function pformat_ast (line 31) | def pformat_ast(node,
  function _extend_name (line 111) | def _extend_name(prev, parent_co):
  function pprint_ast (line 117) | def pprint_ast(node,
  function walk_code (line 149) | def walk_code(co, _prefix=''):
  function iter_attributes (line 164) | def iter_attributes(node):
  function a (line 172) | def a(text, mode='exec', indent='  ', file=None):
  function d (line 193) | def d(obj, mode='exec', file=None):
  function extract_code (line 225) | def extract_code(obj, compile_mode):
  function _ (line 245) | def _(obj, compile_mode):
  function _ (line 250) | def _(obj, compile_mode):
  function display (line 275) | def display(text, mode='exec', file=None):

FILE: codetransformer/utils/tests/test_immutable.py
  class a (line 8) | class a(immutable):
    method spec (line 11) | def spec(__self, a):
  class b (line 15) | class b(immutable):
    method spec (line 18) | def spec(__self, a, b):
  class c (line 22) | class c(immutable):
    method spec (line 25) | def spec(__self, a, b, *c):
  class d (line 29) | class d(immutable):
    method spec (line 32) | def spec(__self, a, b, **c):
  class e (line 36) | class e(immutable):
    method spec (line 39) | def spec(__self, a, b, *, c):
  class f (line 43) | class f(immutable):
    method spec (line 46) | def spec(__self, a, b, *c, d):
  class g (line 50) | class g(immutable, defaults={'a': 1}):
    method spec (line 53) | def spec(__self, a=1):
  class h (line 57) | class h(immutable, defaults={'b': 2}):
    method spec (line 60) | def spec(__self, a, b=2):
  class i (line 64) | class i(immutable, defaults={'a': 1, 'b': 2}):
    method spec (line 67) | def spec(__self, a=1, b=2):
  class j (line 71) | class j(immutable, defaults={'c': 3}):
    method spec (line 74) | def spec(__self, a, b, *, c=3):
  function test_created_signature_single (line 79) | def test_created_signature_single(cls):
  class k (line 83) | class k(immutable):
    method __init__ (line 86) | def __init__(self, a):
  class l (line 90) | class l(immutable):
    method __init__ (line 93) | def __init__(self, *a):
  class m (line 97) | class m(immutable):
    method __init__ (line 100) | def __init__(self, **a):
  class n (line 104) | class n(immutable):
    method __init__ (line 107) | def __init__(self, *, a):
  class o (line 111) | class o(immutable):
    method __init__ (line 114) | def __init__(self, a, b=2):
  class p (line 118) | class p(immutable):
    method __init__ (line 121) | def __init__(self, a=1, b=2):
  class q (line 125) | class q(immutable):
    method __init__ (line 128) | def __init__(self, a, *b):
  class r (line 132) | class r(immutable):
    method __init__ (line 135) | def __init__(self, a=1, *b):
  class s (line 139) | class s(immutable):
    method __init__ (line 142) | def __init__(self, a=1, *b, c):
  class t (line 146) | class t(immutable):
    method __init__ (line 149) | def __init__(self, a, *b, c=3):
  class u (line 153) | class u(immutable):
    method __init__ (line 156) | def __init__(self, a=1, *b, c=3):
  class v (line 160) | class v(immutable):
    method __init__ (line 163) | def __init__(self, a, **b):
  class w (line 167) | class w(immutable):
    method __init__ (line 170) | def __init__(self, a, b, **c):
  class x (line 174) | class x(immutable):
    method __init__ (line 177) | def __init__(self, a, *b, **c):
  class y (line 181) | class y(immutable):
    method __init__ (line 184) | def __init__(self, a, *b, c, **d):
  class z (line 188) | class z(immutable):
    method __init__ (line 191) | def __init__(self, a, *b, c=1, **d):
  function test_preserve_custom_init_signature (line 198) | def test_preserve_custom_init_signature(cls):

FILE: codetransformer/utils/tests/test_pretty.py
  function test_a (line 8) | def test_a(capsys):
  function test_walk_code (line 71) | def test_walk_code():

FILE: versioneer.py
  class VersioneerConfig (line 355) | class VersioneerConfig:
  function get_root (line 359) | def get_root():
  function get_config_from_root (line 393) | def get_config_from_root(root):
  class NotThisMethod (line 419) | class NotThisMethod(Exception):
  function register_vcs_handler (line 427) | def register_vcs_handler(vcs, method):  # decorator
  function run_command (line 436) | def run_command(commands, args, cwd=None, verbose=False, hide_stderr=Fal...
  function git_get_keywords (line 931) | def git_get_keywords(versionfile_abs):
  function git_versions_from_keywords (line 955) | def git_versions_from_keywords(keywords, tag_prefix, verbose):
  function git_pieces_from_vcs (line 1000) | def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_comma...
  function do_vcs_install (line 1081) | def do_vcs_install(manifest_in, versionfile_source, ipy):
  function versions_from_parentdir (line 1114) | def versions_from_parentdir(parentdir_prefix, root, verbose):
  function versions_from_file (line 1146) | def versions_from_file(filename):
  function write_to_version_file (line 1159) | def write_to_version_file(filename, versions):
  function plus_or_dot (line 1169) | def plus_or_dot(pieces):
  function render_pep440 (line 1175) | def render_pep440(pieces):
  function render_pep440_pre (line 1199) | def render_pep440_pre(pieces):
  function render_pep440_post (line 1215) | def render_pep440_post(pieces):
  function render_pep440_old (line 1241) | def render_pep440_old(pieces):
  function render_git_describe (line 1261) | def render_git_describe(pieces):
  function render_git_describe_long (line 1280) | def render_git_describe_long(pieces):
  function render (line 1298) | def render(pieces, style):
  class VersioneerBadRootError (line 1327) | class VersioneerBadRootError(Exception):
  function get_versions (line 1331) | def get_versions(verbose=False):
  function get_version (line 1404) | def get_version():
  function get_cmdclass (line 1408) | def get_cmdclass():
  function do_setup (line 1577) | def do_setup():
  function scan_setup_py (line 1658) | def scan_setup_py():
Condensed preview — 56 files, each showing path, character count, and a content snippet. Download the .json file or copy for the full structured content (369K chars).
[
  {
    "path": ".coveragerc",
    "chars": 45,
    "preview": "[run]\nomit =\n    codetransformer/_version.py\n"
  },
  {
    "path": ".gitattributes",
    "chars": 41,
    "preview": "codetransformer/_version.py export-subst\n"
  },
  {
    "path": ".gitignore",
    "chars": 561,
    "preview": ".bundle\ndb/*.sqlite3\nlog/*.log\n*.log\ntmp/**/*\ntmp/*\n*.swp\n*~\n#mac autosaving file\n.DS_Store\n*.py[co]\n\n# Installer logs\np"
  },
  {
    "path": ".travis.yml",
    "chars": 202,
    "preview": "language: python\nsudo: false\npython:\n  - 3.4.3\n  - 3.4\n  - 3.5\n  - 3.6\n\ninstall:\n  - pip install -e .[dev]\n\nscript:\n  - "
  },
  {
    "path": "LICENSE",
    "chars": 18043,
    "preview": "             GNU GENERAL PUBLIC LICENSE\n                Version 2, June 1991\n\n Copyright (C) 1989, 1991 Free Software Fo"
  },
  {
    "path": "MANIFEST.in",
    "chars": 58,
    "preview": "include versioneer.py\ninclude codetransformer/_version.py\n"
  },
  {
    "path": "README.rst",
    "chars": 9854,
    "preview": "``codetransformer``\n===================\n\n|build status| |documentation|\n\nBytecode transformers for CPython inspired by t"
  },
  {
    "path": "codetransformer/__init__.py",
    "chars": 1096,
    "preview": "from .code import Code, Flag\nfrom .core import CodeTransformer\nfrom . patterns import (\n    matchany,\n    not_,\n    opti"
  },
  {
    "path": "codetransformer/_version.py",
    "chars": 15781,
    "preview": "\n# This file helps to compute a version number in source trees obtained from\n# git-archive tarball (such as those provid"
  },
  {
    "path": "codetransformer/code.py",
    "chars": 25539,
    "preview": "from collections import OrderedDict\nfrom dis import Bytecode, dis, findlinestarts\nfrom enum import IntEnum, unique\nfrom "
  },
  {
    "path": "codetransformer/core.py",
    "chars": 8575,
    "preview": "from collections import OrderedDict\nfrom contextlib import contextmanager\nfrom ctypes import py_object, pythonapi\nfrom i"
  },
  {
    "path": "codetransformer/decompiler/_343.py",
    "chars": 52636,
    "preview": "import ast\nfrom collections import deque\nfrom functools import singledispatch\nfrom itertools import takewhile\nimport typ"
  },
  {
    "path": "codetransformer/decompiler/__init__.py",
    "chars": 844,
    "preview": "import sys\n\nfrom ..code import Flag\n\n\ndef paramnames(co):\n    \"\"\"\n    Get the parameter names from a pycode object.\n\n   "
  },
  {
    "path": "codetransformer/instructions.py",
    "chars": 12560,
    "preview": "from abc import ABCMeta, abstractmethod\nfrom dis import opname, opmap, hasjabs, hasjrel, HAVE_ARGUMENT, stack_effect\nfro"
  },
  {
    "path": "codetransformer/patterns.py",
    "chars": 9353,
    "preview": "from operator import methodcaller, index, attrgetter\nimport re\nfrom types import MethodType\n\nfrom .utils.instance import"
  },
  {
    "path": "codetransformer/tests/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "codetransformer/tests/test_code.py",
    "chars": 5761,
    "preview": "from dis import dis\nfrom io import StringIO\nfrom itertools import product, chain\nimport random\nimport sys\n\nimport pytest"
  },
  {
    "path": "codetransformer/tests/test_core.py",
    "chars": 2868,
    "preview": "import pytest\nimport toolz.curried.operator as op\n\nfrom codetransformer import CodeTransformer, Code, pattern\nfrom codet"
  },
  {
    "path": "codetransformer/tests/test_decompiler.py",
    "chars": 19282,
    "preview": "\"\"\"\nTests for decompiler.py\n\"\"\"\nfrom ast import AST, iter_fields, Module, parse\nfrom functools import partial\nfrom itert"
  },
  {
    "path": "codetransformer/tests/test_instructions.py",
    "chars": 205,
    "preview": "from codetransformer.instructions import Instruction\n\n\ndef test_repr_types():\n    assert repr(Instruction) == 'Instructi"
  },
  {
    "path": "codetransformer/transformers/__init__.py",
    "chars": 911,
    "preview": "from .constants import asconstants\nfrom .interpolated_strings import interpolated_strings\nfrom .pattern_matched_exceptio"
  },
  {
    "path": "codetransformer/transformers/add2mul.py",
    "chars": 452,
    "preview": "\"\"\"\nadd2mul\n--------\n\nA transformer that replaces BINARY_ADD instructions with BINARY_MULTIPLY\ninstructions.\n\nThis isn't"
  },
  {
    "path": "codetransformer/transformers/constants.py",
    "chars": 3495,
    "preview": "import builtins\n\nfrom ..core import CodeTransformer\nfrom ..instructions import (\n    DELETE_DEREF,\n    DELETE_FAST,\n    "
  },
  {
    "path": "codetransformer/transformers/interpolated_strings.py",
    "chars": 4041,
    "preview": "\"\"\"\nA transformer implementing ruby-style interpolated strings.\n\"\"\"\nimport sys\n\nfrom codetransformer import pattern, Cod"
  },
  {
    "path": "codetransformer/transformers/literals.py",
    "chars": 13053,
    "preview": "from collections import OrderedDict\nfrom decimal import Decimal\nfrom itertools import islice\nimport sys\nfrom textwrap im"
  },
  {
    "path": "codetransformer/transformers/pattern_matched_exceptions.py",
    "chars": 3626,
    "preview": "import sys\n\nfrom ..core import CodeTransformer\nfrom ..instructions import (\n    BUILD_TUPLE,\n    CALL_FUNCTION,\n    COMP"
  },
  {
    "path": "codetransformer/transformers/precomputed_slices.py",
    "chars": 1374,
    "preview": "from codetransformer.core import CodeTransformer\nfrom codetransformer.instructions import LOAD_CONST, BUILD_SLICE\nfrom c"
  },
  {
    "path": "codetransformer/transformers/tests/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "codetransformer/transformers/tests/test_add2mul.py",
    "chars": 171,
    "preview": "from ..add2mul import add2mul\n\n\ndef test_add2mul():\n\n    @add2mul()\n    def foo(a, b):\n        return (a + b + 2) - 1\n\n "
  },
  {
    "path": "codetransformer/transformers/tests/test_constants.py",
    "chars": 1671,
    "preview": "import os\nfrom sys import _getframe\nfrom types import CodeType\n\nimport pytest\n\nfrom codetransformer.code import Code\nfro"
  },
  {
    "path": "codetransformer/transformers/tests/test_exc_patterns.py",
    "chars": 1780,
    "preview": "from pytest import raises\nfrom ..pattern_matched_exceptions import pattern_matched_exceptions\n\n\ndef test_patterns():\n\n  "
  },
  {
    "path": "codetransformer/transformers/tests/test_interpolated_strings.py",
    "chars": 1993,
    "preview": "import sys\n\nimport pytest\n\nfrom ..interpolated_strings import interpolated_strings\n\n\npytestmark = pytest.mark.skipif(\n  "
  },
  {
    "path": "codetransformer/transformers/tests/test_literals.py",
    "chars": 4872,
    "preview": "\"\"\"\nTests for literal transformers\n\"\"\"\nfrom collections import OrderedDict\nfrom decimal import Decimal\nfrom itertools im"
  },
  {
    "path": "codetransformer/transformers/tests/test_precomputed_slices.py",
    "chars": 941,
    "preview": "from codetransformer.code import Code\nfrom codetransformer.instructions import BUILD_SLICE, LOAD_CONST\n\nfrom ..precomput"
  },
  {
    "path": "codetransformer/utils/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "codetransformer/utils/functional.py",
    "chars": 2557,
    "preview": "\"\"\"\ncodetransformer.utils.functional\n--------------------------------\n\nUtilities for functional programming.\n\"\"\"\n\nfrom t"
  },
  {
    "path": "codetransformer/utils/immutable.py",
    "chars": 9734,
    "preview": "\"\"\"\ncodetransformer.utils.immutable\n-------------------------------\n\nUtilities for creating and working with immutable o"
  },
  {
    "path": "codetransformer/utils/instance.py",
    "chars": 234,
    "preview": "def instance(cls):\n    \"\"\"Decorator for creating one of instances.\n\n    Parameters\n    ----------\n    cls : type\n       "
  },
  {
    "path": "codetransformer/utils/no_default.py",
    "chars": 290,
    "preview": "@object.__new__\nclass no_default:\n    def __new__(cls):\n        return no_default\n\n    def __repr__(self):\n        retur"
  },
  {
    "path": "codetransformer/utils/pretty.py",
    "chars": 8088,
    "preview": "\"\"\"\ncodetransformer.utils.pretty\n----------------------------\n\nUtilities for pretty-printing ASTs and code objects.\n\"\"\"\n"
  },
  {
    "path": "codetransformer/utils/tests/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "codetransformer/utils/tests/test_immutable.py",
    "chars": 3059,
    "preview": "from inspect import getfullargspec\n\nimport pytest\n\nfrom codetransformer.utils.immutable import immutable\n\n\nclass a(immut"
  },
  {
    "path": "codetransformer/utils/tests/test_pretty.py",
    "chars": 2452,
    "preview": "from io import StringIO\nfrom textwrap import dedent\nfrom types import CodeType\n\nfrom ..pretty import a, walk_code\n\n\ndef "
  },
  {
    "path": "docs/.dir-locals.el",
    "chars": 437,
    "preview": ";; Set compile-commnd for everything in this directory to\n;; \"make -C <this-directory> html\"\n\n;; This is an association "
  },
  {
    "path": "docs/Makefile",
    "chars": 7533,
    "preview": "# Makefile for Sphinx documentation\n#\n\n# You can set these variables from the command line.\nSPHINXOPTS    =\nSPHINXBUILD "
  },
  {
    "path": "docs/source/appendix.rst",
    "chars": 1931,
    "preview": "API Reference\n=============\n\n``codetransformer.transformers``\n--------------------------------\n\n.. automodule:: codetran"
  },
  {
    "path": "docs/source/code-objects.rst",
    "chars": 4980,
    "preview": "===========================\n Working with Code Objects\n===========================\n\nThe :class:`~codetransformer.code.Co"
  },
  {
    "path": "docs/source/conf.py",
    "chars": 9754,
    "preview": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# codetransformer documentation build configuration file, created by\n# "
  },
  {
    "path": "docs/source/index.rst",
    "chars": 2007,
    "preview": "codetransformer\n===============\n\nBytecode transformers for CPython inspired by the ``ast`` module's\n``NodeTransformer``."
  },
  {
    "path": "docs/source/magics.rst",
    "chars": 2307,
    "preview": "Interactive Conveniences\n========================\n\nWhen developing projects using :mod:`codetransformer`, it's often hel"
  },
  {
    "path": "docs/source/patterns.rst",
    "chars": 7984,
    "preview": "============\n Pattern API\n============\n\nMost bytecode transformations are best expressed by identifying a pattern in the"
  },
  {
    "path": "requirements_doc.txt",
    "chars": 38,
    "preview": "Sphinx==1.3.5\nsphinx-rtd-theme==0.1.9\n"
  },
  {
    "path": "setup.cfg",
    "chars": 341,
    "preview": "# See the docstring in versioneer.py for instructions. Note that you must\n# re-run 'versioneer.py setup' after changing "
  },
  {
    "path": "setup.py",
    "chars": 1294,
    "preview": "#!/usr/bin/env python\nfrom setuptools import setup, find_packages\nimport sys\n\nimport versioneer\n\nlong_description = ''\n\n"
  },
  {
    "path": "tox.ini",
    "chars": 276,
    "preview": "[tox]\nenvlist=py{34,35,36}\nskip_missing_interpreters=True\n\n[testenv]\ncommands=\n    pip install -e .[dev]\n    py.test\n\n[p"
  },
  {
    "path": "versioneer.py",
    "chars": 62474,
    "preview": "\n# Version: 0.15\n\n\"\"\"\nThe Versioneer\n==============\n\n* like a rocketeer, but for versions!\n* https://github.com/warner/p"
  }
]

About this extraction

This page contains the full source code of the llllllllll/codetransformer GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 56 files (341.3 KB), approximately 86.0k tokens, and a symbol index with 524 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.

Copied to clipboard!