Full Code of python/typed_ast for AI

Repository: python/typed_ast
Branch: master
Commit: 567af09d8fc0
Files: 95
Total size: 1.5 MB

Directory structure:
gitextract_xd1s015c/

├── .gitattributes
├── .github/
│   └── workflows/
│       └── build.yml
├── .gitignore
├── CONTRIBUTING.md
├── LICENSE
├── MANIFEST.in
├── README.md
├── ast27/
│   ├── Custom/
│   │   └── typed_ast.c
│   ├── Grammar/
│   │   └── Grammar
│   ├── Include/
│   │   ├── Python-ast.h
│   │   ├── asdl.h
│   │   ├── ast.h
│   │   ├── bitset.h
│   │   ├── compile.h
│   │   ├── errcode.h
│   │   ├── graminit.h
│   │   ├── grammar.h
│   │   ├── node.h
│   │   ├── parsetok.h
│   │   ├── pgenheaders.h
│   │   ├── pyarena.h
│   │   ├── pycore_pyarena.h
│   │   └── token.h
│   ├── Parser/
│   │   ├── Python.asdl
│   │   ├── acceler.c
│   │   ├── asdl.py
│   │   ├── asdl_c.py
│   │   ├── bitset.c
│   │   ├── grammar.c
│   │   ├── grammar1.c
│   │   ├── node.c
│   │   ├── parser.c
│   │   ├── parser.h
│   │   ├── parsetok.c
│   │   ├── spark.py
│   │   ├── tokenizer.c
│   │   └── tokenizer.h
│   └── Python/
│       ├── Python-ast.c
│       ├── asdl.c
│       ├── ast.c
│       ├── graminit.c
│       └── mystrtoul.c
├── ast3/
│   ├── Custom/
│   │   └── typed_ast.c
│   ├── Grammar/
│   │   └── Grammar
│   ├── Include/
│   │   ├── Python-ast.h
│   │   ├── asdl.h
│   │   ├── ast.h
│   │   ├── bitset.h
│   │   ├── errcode.h
│   │   ├── graminit.h
│   │   ├── grammar.h
│   │   ├── node.h
│   │   ├── parsetok.h
│   │   ├── pgenheaders.h
│   │   ├── pyarena.h
│   │   ├── pycore_pyarena.h
│   │   └── token.h
│   ├── Parser/
│   │   ├── Python.asdl
│   │   ├── acceler.c
│   │   ├── asdl.py
│   │   ├── asdl_c.py
│   │   ├── bitset.c
│   │   ├── grammar.c
│   │   ├── grammar1.c
│   │   ├── node.c
│   │   ├── parser.c
│   │   ├── parser.h
│   │   ├── parsetok.c
│   │   ├── tokenizer.c
│   │   └── tokenizer.h
│   ├── Python/
│   │   ├── Python-ast.c
│   │   ├── asdl.c
│   │   ├── ast.c
│   │   └── graminit.c
│   └── tests/
│       └── test_basics.py
├── release_process.md
├── setup.py
├── tools/
│   ├── Grammar.patch
│   ├── Python-asdl.patch
│   ├── asdl_c.patch
│   ├── ast.patch
│   ├── find_exported_symbols
│   ├── parsetok.patch
│   ├── script
│   ├── token.patch
│   ├── tokenizer.patch
│   ├── update_ast27_asdl
│   ├── update_ast3_asdl
│   ├── update_ast3_grammar
│   ├── update_exported_symbols
│   └── update_header_guards
└── typed_ast/
    ├── __init__.py
    ├── ast27.py
    ├── ast3.py
    └── conversions.py

================================================
FILE CONTENTS
================================================

================================================
FILE: .gitattributes
================================================
# Generated files
# https://github.com/github/linguist#generated-code
ast3/Include/graminit.h          linguist-generated=true
ast3/Python/graminit.h           linguist-generated=true
ast3/Include/Python-ast.h        linguist-generated=true
ast3/Python/Python-ast.c         linguist-generated=true
ast3/Include/token.h             linguist-generated=true
ast3/Lib/token.py                linguist-generated=true
ast3/Parser/token.c              linguist-generated=true
ast27/Include/graminit.h         linguist-generated=true
ast27/Python/graminit.h          linguist-generated=true
ast27/Include/Python-ast.h       linguist-generated=true
ast27/Python/Python-ast.c        linguist-generated=true
ast27/Include/token.h            linguist-generated=true
ast27/Lib/token.py               linguist-generated=true
ast27/Parser/token.c             linguist-generated=true


================================================
FILE: .github/workflows/build.yml
================================================
name: Build wheels

on: [push, pull_request, workflow_dispatch]

jobs:
  build_wheels:
    name: py${{ matrix.python-version }} on ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        # cibuildwheel builds linux wheels inside a manylinux container
        # it also takes care of procuring the correct python version for us
        os: [ubuntu-latest, windows-latest, macos-latest]
        python-version: [36, 37, 38, 39, 310, 311]

    steps:
      - uses: actions/checkout@v3
      - name: Build wheels
        uses: pypa/cibuildwheel@v2.10.2
        env:
          CIBW_BUILD: "cp${{ matrix.python-version }}-*"
          CIBW_SKIP: "*-manylinux_i686 *-musllinux_i686 *-win32"
          CIBW_ARCHS_MACOS: "x86_64 arm64"
          CIBW_BUILD_VERBOSITY: 1
          CIBW_BEFORE_TEST: pip install pytest
          CIBW_TEST_COMMAND: pytest {package}
      - uses: actions/upload-artifact@v3
        with:
          name: dist
          path: ./wheelhouse/*.whl

  build_wheels_aarch64:
    name: py${{ matrix.python-version }} on ${{ matrix.os }} (aarch64)
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        # cibuildwheel builds linux wheels inside a manylinux container
        # it also takes care of procuring the correct python version for us
        os: [ubuntu-latest]
        python-version: [36, 37, 38, 39, 310, 311]

    steps:
      - uses: actions/checkout@v3
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2
        with:
          platforms: arm64
      - name: Build wheels
        uses: pypa/cibuildwheel@v2.10.2
        env:
          CIBW_BUILD: "cp${{ matrix.python-version }}-*"
          CIBW_ARCHS: aarch64
          CIBW_BUILD_VERBOSITY: 1
          CIBW_BEFORE_TEST: pip install pytest
          CIBW_TEST_COMMAND: pytest {package}
      - uses: actions/upload-artifact@v3
        with:
          name: dist
          path: ./wheelhouse/*.whl

  build_sdist_python_wheel:
    name: sdist and python wheel
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        name: Install Python
        with:
          python-version: "3.9"
      - name: Run check-manifest
        run: |
          pip install check-manifest
          check-manifest -v
      - name: Build sdist and wheel
        run: |
          pip install --upgrade setuptools pip wheel
          python setup.py sdist
      - uses: actions/upload-artifact@v3
        with:
          name: dist
          path: |
            dist/*.tar.gz


================================================
FILE: .gitignore
================================================
*.o
*.pyc
/build/
__pycache__/
.DS_Store
/tools/pgen3
/.pytest_cache/
/typed_ast.egg-info/


================================================
FILE: CONTRIBUTING.md
================================================
To contribute code to this project, you'll need to sign the [Python Software Foundation's Contributor License Agreement](https://www.python.org/psf/contrib/contrib-form/).


================================================
FILE: LICENSE
================================================
Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
Upstream-Name: typed-ast
Source: https://pypi.python.org/pypi/typed-ast

Files: *
Copyright: © 2016 David Fisher <ddfisher@dropbox.com>
License: Apache-2.0

Files: *
Copyright: © 2016 David Fisher <ddfisher@dropbox.com>
           © 2008 Armin Ronacher
Comment: The original CPython source is licensed under the
 Python Software Foundation License Version 2
License: Python

Files: ast27/Parser/spark.py
Copyright: © 1998-2002 John Aycock
License: Expat
 Permission is hereby granted, free of charge, to any person obtaining
 a copy of this software and associated documentation files (the
 "Software"), to deal in the Software without restriction, including
 without limitation the rights to use, copy, modify, merge, publish,
 distribute, sublicense, and/or sell copies of the Software, and to
 permit persons to whom the Software is furnished to do so, subject to
 the following conditions:

 The above copyright notice and this permission notice shall be
 included in all copies or substantial portions of the Software.

 THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
 EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
 MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
 IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
 CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
 TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
 SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


License: Apache-2.0
                               Apache License
                         Version 2.0, January 2004
                      http://www.apache.org/licenses/
 .
 TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
 .
 1. Definitions.
 .
    "License" shall mean the terms and conditions for use, reproduction,
    and distribution as defined by Sections 1 through 9 of this document.
 .
    "Licensor" shall mean the copyright owner or entity authorized by
    the copyright owner that is granting the License.
 .
    "Legal Entity" shall mean the union of the acting entity and all
    other entities that control, are controlled by, or are under common
    control with that entity. For the purposes of this definition,
    "control" means (i) the power, direct or indirect, to cause the
    direction or management of such entity, whether by contract or
    otherwise, or (ii) ownership of fifty percent (50%) or more of the
    outstanding shares, or (iii) beneficial ownership of such entity.
 .
    "You" (or "Your") shall mean an individual or Legal Entity
    exercising permissions granted by this License.
 .
    "Source" form shall mean the preferred form for making modifications,
    including but not limited to software source code, documentation
    source, and configuration files.
 .
    "Object" form shall mean any form resulting from mechanical
    transformation or translation of a Source form, including but
    not limited to compiled object code, generated documentation,
    and conversions to other media types.
 .
    "Work" shall mean the work of authorship, whether in Source or
    Object form, made available under the License, as indicated by a
    copyright notice that is included in or attached to the work
    (an example is provided in the Appendix below).
 .
    "Derivative Works" shall mean any work, whether in Source or Object
    form, that is based on (or derived from) the Work and for which the
    editorial revisions, annotations, elaborations, or other modifications
    represent, as a whole, an original work of authorship. For the purposes
    of this License, Derivative Works shall not include works that remain
    separable from, or merely link (or bind by name) to the interfaces of,
    the Work and Derivative Works thereof.
 .
    "Contribution" shall mean any work of authorship, including
    the original version of the Work and any modifications or additions
    to that Work or Derivative Works thereof, that is intentionally
    submitted to Licensor for inclusion in the Work by the copyright owner
    or by an individual or Legal Entity authorized to submit on behalf of
    the copyright owner. For the purposes of this definition, "submitted"
    means any form of electronic, verbal, or written communication sent
    to the Licensor or its representatives, including but not limited to
    communication on electronic mailing lists, source code control systems,
    and issue tracking systems that are managed by, or on behalf of, the
    Licensor for the purpose of discussing and improving the Work, but
    excluding communication that is conspicuously marked or otherwise
    designated in writing by the copyright owner as "Not a Contribution."
 .
    "Contributor" shall mean Licensor and any individual or Legal Entity
    on behalf of whom a Contribution has been received by Licensor and
    subsequently incorporated within the Work.
 .
 2. Grant of Copyright License. Subject to the terms and conditions of
    this License, each Contributor hereby grants to You a perpetual,
    worldwide, non-exclusive, no-charge, royalty-free, irrevocable
    copyright license to reproduce, prepare Derivative Works of,
    publicly display, publicly perform, sublicense, and distribute the
    Work and such Derivative Works in Source or Object form.
 .
 3. Grant of Patent License. Subject to the terms and conditions of
    this License, each Contributor hereby grants to You a perpetual,
    worldwide, non-exclusive, no-charge, royalty-free, irrevocable
    (except as stated in this section) patent license to make, have made,
    use, offer to sell, sell, import, and otherwise transfer the Work,
    where such license applies only to those patent claims licensable
    by such Contributor that are necessarily infringed by their
    Contribution(s) alone or by combination of their Contribution(s)
    with the Work to which such Contribution(s) was submitted. If You
    institute patent litigation against any entity (including a
    cross-claim or counterclaim in a lawsuit) alleging that the Work
    or a Contribution incorporated within the Work constitutes direct
    or contributory patent infringement, then any patent licenses
    granted to You under this License for that Work shall terminate
    as of the date such litigation is filed.
 .
 4. Redistribution. You may reproduce and distribute copies of the
    Work or Derivative Works thereof in any medium, with or without
    modifications, and in Source or Object form, provided that You
    meet the following conditions:
 .
    (a) You must give any other recipients of the Work or
        Derivative Works a copy of this License; and
 .
    (b) You must cause any modified files to carry prominent notices
        stating that You changed the files; and
 .
    (c) You must retain, in the Source form of any Derivative Works
        that You distribute, all copyright, patent, trademark, and
        attribution notices from the Source form of the Work,
        excluding those notices that do not pertain to any part of
        the Derivative Works; and
 .
    (d) If the Work includes a "NOTICE" text file as part of its
        distribution, then any Derivative Works that You distribute must
        include a readable copy of the attribution notices contained
        within such NOTICE file, excluding those notices that do not
        pertain to any part of the Derivative Works, in at least one
        of the following places: within a NOTICE text file distributed
        as part of the Derivative Works; within the Source form or
        documentation, if provided along with the Derivative Works; or,
        within a display generated by the Derivative Works, if and
        wherever such third-party notices normally appear. The contents
        of the NOTICE file are for informational purposes only and
        do not modify the License. You may add Your own attribution
        notices within Derivative Works that You distribute, alongside
        or as an addendum to the NOTICE text from the Work, provided
        that such additional attribution notices cannot be construed
        as modifying the License.
 .
    You may add Your own copyright statement to Your modifications and
    may provide additional or different license terms and conditions
    for use, reproduction, or distribution of Your modifications, or
    for any such Derivative Works as a whole, provided Your use,
    reproduction, and distribution of the Work otherwise complies with
    the conditions stated in this License.
 .
 5. Submission of Contributions. Unless You explicitly state otherwise,
    any Contribution intentionally submitted for inclusion in the Work
    by You to the Licensor shall be under the terms and conditions of
    this License, without any additional terms or conditions.
    Notwithstanding the above, nothing herein shall supersede or modify
    the terms of any separate license agreement you may have executed
    with Licensor regarding such Contributions.
 .
 6. Trademarks. This License does not grant permission to use the trade
    names, trademarks, service marks, or product names of the Licensor,
    except as required for reasonable and customary use in describing the
    origin of the Work and reproducing the content of the NOTICE file.
 .
 7. Disclaimer of Warranty. Unless required by applicable law or
    agreed to in writing, Licensor provides the Work (and each
    Contributor provides its Contributions) on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
    implied, including, without limitation, any warranties or conditions
    of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
    PARTICULAR PURPOSE. You are solely responsible for determining the
    appropriateness of using or redistributing the Work and assume any
    risks associated with Your exercise of permissions under this License.
 .
 8. Limitation of Liability. In no event and under no legal theory,
    whether in tort (including negligence), contract, or otherwise,
    unless required by applicable law (such as deliberate and grossly
    negligent acts) or agreed to in writing, shall any Contributor be
    liable to You for damages, including any direct, indirect, special,
    incidental, or consequential damages of any character arising as a
    result of this License or out of the use or inability to use the
    Work (including but not limited to damages for loss of goodwill,
    work stoppage, computer failure or malfunction, or any and all
    other commercial damages or losses), even if such Contributor
    has been advised of the possibility of such damages.
 .
 9. Accepting Warranty or Additional Liability. While redistributing
    the Work or Derivative Works thereof, You may choose to offer,
    and charge a fee for, acceptance of support, warranty, indemnity,
    or other liability obligations and/or rights consistent with this
    License. However, in accepting such obligations, You may act only
    on Your own behalf and on Your sole responsibility, not on behalf
    of any other Contributor, and only if You agree to indemnify,
    defend, and hold each Contributor harmless for any liability
    incurred by, or claims asserted against, such Contributor by reason
    of your accepting any such warranty or additional liability.
 .
 END OF TERMS AND CONDITIONS
 .
 APPENDIX: How to apply the Apache License to your work.
 .
    To apply the Apache License to your work, attach the following
    boilerplate notice, with the fields enclosed by brackets "[]"
    replaced with your own identifying information. (Don't include
    the brackets!)  The text should be enclosed in the appropriate
    comment syntax for the file format. We also recommend that a
    file or class name and description of purpose be included on the
    same "printed page" as the copyright notice for easier
    identification within third-party archives.
 .
 Copyright 2016 Dropbox, Inc.
 .
 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.
 You may obtain a copy of the License at
 .
     http://www.apache.org/licenses/LICENSE-2.0
 .
 Unless required by applicable law or agreed to in writing, software
 distributed under the License is distributed on an "AS IS" BASIS,
 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 See the License for the specific language governing permissions and
 limitations under the License.
 
License: Python
 PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
 --------------------------------------------
 .
 1. This LICENSE AGREEMENT is between the Python Software Foundation
 ("PSF"), and the Individual or Organization ("Licensee") accessing and
 otherwise using this software ("Python") in source or binary form and
 its associated documentation.
 .
 2. Subject to the terms and conditions of this License Agreement, PSF hereby
 grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
 analyze, test, perform and/or display publicly, prepare derivative works,
 distribute, and otherwise use Python alone or in any derivative version,
 provided, however, that PSF's License Agreement and PSF's notice of copyright,
 i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
 2011, 2012, 2013, 2014, 2015, 2016 Python Software Foundation; All Rights
 Reserved" are retained in Python alone or in any derivative version prepared by
 Licensee.
 .
 3. In the event Licensee prepares a derivative work that is based on
 or incorporates Python or any part thereof, and wants to make
 the derivative work available to others as provided herein, then
 Licensee hereby agrees to include in any such work a brief summary of
 the changes made to Python.
 .
 4. PSF is making Python available to Licensee on an "AS IS"
 basis.  PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
 IMPLIED.  BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
 DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
 FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
 INFRINGE ANY THIRD PARTY RIGHTS.
 .
 5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
 A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
 OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
 .
 6. This License Agreement will automatically terminate upon a material
 breach of its terms and conditions.
 .
 7. Nothing in this License Agreement shall be deemed to create any
 relationship of agency, partnership, or joint venture between PSF and
 Licensee.  This License Agreement does not grant permission to use PSF
 trademarks or trade name in a trademark sense to endorse or promote
 products or services of Licensee, or any third party.
 .
 8. By copying, installing or otherwise using Python, Licensee
 agrees to be bound by the terms and conditions of this License
 Agreement.


================================================
FILE: MANIFEST.in
================================================
include ast27/Grammar/Grammar
include ast27/Parser/Python.asdl
recursive-include ast27 *.h
recursive-include ast27 *.py

include ast3/Grammar/Grammar
include ast3/Parser/Python.asdl
recursive-include ast3 *.h
recursive-include ast3 *.py

recursive-include ast3/tests *.py
include LICENSE

prune tools
exclude CONTRIBUTING.md
exclude release_process.md
exclude update_process.md


================================================
FILE: README.md
================================================
# End of life

This project is no longer maintained.

Use the standard library `ast` module instead.
See https://github.com/python/typed_ast/issues/179.

# Typed AST

[![Build Status](https://travis-ci.org/python/typed_ast.svg?branch=master)](https://travis-ci.org/python/typed_ast)
[![Chat at https://gitter.im/python/typed_ast](https://badges.gitter.im/python/typed_ast.svg)](https://gitter.im/python/typed_ast)

`typed_ast` is a Python 3 package that provides a Python 2.7 and Python 3
parser similar to the standard `ast` library.  Unlike `ast` up to Python 3.7, the parsers in
`typed_ast` include [PEP 484](https://www.python.org/dev/peps/pep-0484/) type
comments and are independent of the version of Python under which they are run.
The `typed_ast` parsers produce the standard Python AST (plus type comments),
and are both fast and correct, as they are based on the CPython 2.7 and 3.7
parsers.  `typed_ast` runs on CPython 3.6-3.10 on Linux, OS X and Windows.

**Note:** Starting with Python 3.8, we recommend using the native `ast` parser
(see below).

## Development Philosophy

This project is a (mostly) drop-in replacement for the builtin `ast` module.  It is
intended to be bug-for-bug compatible and behave identically, except for the
presence of a few additional fields on the returned classes and a few
additional optional arguments to the `parse` call.  Therefore, `typed_ast` will
not accept any bugfixes for bugs in `ast` -- they should be fixed upstream
instead.  To avoid feature bloat, any new features for `typed_ast` should have
the potential to be broadly useful and not be built just for one niche usecase
or in a manner such that only one project can use them.

### Incompatibilities

For the purposes of *consuming* syntax trees, this should be a drop-in replacement.
It is not a drop-in replacement for users that wish to create or transform ASTs,
as a number of syntax tree classes have additional fields that must be populated
when constructing them.

Due to its reliance on certain C APIs, this library does not build on PyPy, and
there are [no plans to support it](https://github.com/python/typed_ast/issues/111).

### Python 3.8

`typed_ast` will not be updated to support parsing Python 3.8 and
newer.  Instead, it is recommended to use the stdlib `ast` module
there, which has been augmented to support extracting type comments
and has limited support for parsing older versions of Python 3.
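
As the paragraph above notes, the stdlib `ast` module (Python 3.8+) can
extract PEP 484 type comments itself. A minimal sketch of that replacement
path:

```python
import ast

# With type_comments=True, ast.parse attaches the raw text of a
# "# type: ..." comment to the corresponding statement node.
source = "x = 1  # type: int\n"
tree = ast.parse(source, type_comments=True)

assign = tree.body[0]          # the Assign node for "x = 1"
print(assign.type_comment)     # prints "int"
```

Without `type_comments=True`, the comment is discarded and `type_comment`
is `None`, matching the pre-3.8 `ast` behavior.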

## Submodules
### ast3
The `ast3` parser produces the AST from Python 3 code, up to Python 3.7.
(For rationale and technical
details, see [here](update_process.md).)  The AST it currently produces is described in
[ast3/Parser/Python.asdl](ast3/Parser/Python.asdl).  If you wish to limit
parsing to older versions of Python 3, `ast3` can be configured to give a
SyntaxError for new syntax features introduced beyond a given Python version.
For more information, see the module docstring in
[typed\_ast/ast3.py](typed_ast/ast3.py).

### ast27
The `ast27` parser tracks the standard Python 2.7 AST, which is expected to
never receive further updates. The AST it produces is described in
[ast27/Parser/Python.asdl](ast27/Parser/Python.asdl).  For more information,
see the module docstring in [typed\_ast/ast27.py](typed_ast/ast27.py).

### conversions
`typed_ast` also provides a `conversions` module which converts `ast27` ASTs
into `ast3` ASTs.  This functionality is somewhat experimental, however.  For
more information, see the `py2to3` docstring in
[typed\_ast/conversions](typed_ast/conversions.py).


Note: as these parsers consider type comments part of the grammar, incorrectly
placed type comments are considered syntax errors.

## Releases

To make a new `typed_ast` release, see [`release_process.md`](release_process.md).


================================================
FILE: ast27/Custom/typed_ast.c
================================================
#include "Python.h"
#include "../Include/Python-ast.h"
#include "../Include/compile.h"
#include "../Include/node.h"
#include "../Include/grammar.h"
#include "../Include/token.h"
#include "../Include/ast.h"
#include "../Include/parsetok.h"
#include "../Include/errcode.h"
#include "../Include/graminit.h"

extern grammar _Ta27Parser_Grammar; /* from graminit.c */

// from Python/bltinmodule.c
static const char *
source_as_string(PyObject *cmd, const char *funcname, const char *what, PyCompilerFlags *cf, PyObject **cmd_copy)
{
    const char *str;
    Py_ssize_t size;
    Py_buffer view;

    *cmd_copy = NULL;
    if (PyUnicode_Check(cmd)) {
        cf->cf_flags |= PyCF_IGNORE_COOKIE;
        str = PyUnicode_AsUTF8AndSize(cmd, &size);
        if (str == NULL)
            return NULL;
    }
    else if (PyBytes_Check(cmd)) {
        str = PyBytes_AS_STRING(cmd);
        size = PyBytes_GET_SIZE(cmd);
    }
    else if (PyByteArray_Check(cmd)) {
        str = PyByteArray_AS_STRING(cmd);
        size = PyByteArray_GET_SIZE(cmd);
    }
    else if (PyObject_GetBuffer(cmd, &view, PyBUF_SIMPLE) == 0) {
        /* Copy to NUL-terminated buffer. */
        *cmd_copy = PyBytes_FromStringAndSize(
            (const char *)view.buf, view.len);
        PyBuffer_Release(&view);
        if (*cmd_copy == NULL) {
            return NULL;
        }
        str = PyBytes_AS_STRING(*cmd_copy);
        size = PyBytes_GET_SIZE(*cmd_copy);
    }
    else {
        PyErr_Format(PyExc_TypeError,
          "%s() arg 1 must be a %s object",
          funcname, what);
        return NULL;
    }

    if (strlen(str) != (size_t)size) {
        PyErr_SetString(PyExc_ValueError,
                        "source code string cannot contain null bytes");
        Py_CLEAR(*cmd_copy);
        return NULL;
    }
    return str;
}

// from Python/pythonrun.c
/* compute parser flags based on compiler flags */
static int PARSER_FLAGS(PyCompilerFlags *flags)
{
    int parser_flags = 0;
    if (!flags)
        return 0;
    if (flags->cf_flags & PyCF_DONT_IMPLY_DEDENT)
        parser_flags |= PyPARSE_DONT_IMPLY_DEDENT;
    if (flags->cf_flags & PyCF_IGNORE_COOKIE)
        parser_flags |= PyPARSE_IGNORE_COOKIE;
    return parser_flags;
}

// from Python/pythonrun.c
/* Set the error appropriate to the given input error code (see errcode.h) */
static void
err_input(perrdetail *err)
{
    PyObject *v, *w, *errtype, *errtext;
    PyObject *msg_obj = NULL;
    char *msg = NULL;
    int offset = err->offset;

    errtype = PyExc_SyntaxError;
    switch (err->error) {
    case E_ERROR:
        return;
    case E_SYNTAX:
        errtype = PyExc_IndentationError;
        if (err->expected == INDENT)
            msg = "expected an indented block";
        else if (err->token == INDENT)
            msg = "unexpected indent";
        else if (err->token == DEDENT)
            msg = "unexpected unindent";
        else {
            errtype = PyExc_SyntaxError;
            if (err->token == TYPE_COMMENT)
              msg = "misplaced type annotation";
            else
              msg = "invalid syntax";
        }
        break;
    case E_TOKEN:
        msg = "invalid token";
        break;
    case E_EOFS:
        msg = "EOF while scanning triple-quoted string literal";
        break;
    case E_EOLS:
        msg = "EOL while scanning string literal";
        break;
    case E_INTR:
        if (!PyErr_Occurred())
            PyErr_SetNone(PyExc_KeyboardInterrupt);
        goto cleanup;
    case E_NOMEM:
        PyErr_NoMemory();
        goto cleanup;
    case E_EOF:
        msg = "unexpected EOF while parsing";
        break;
    case E_TABSPACE:
        errtype = PyExc_TabError;
        msg = "inconsistent use of tabs and spaces in indentation";
        break;
    case E_OVERFLOW:
        msg = "expression too long";
        break;
    case E_DEDENT:
        errtype = PyExc_IndentationError;
        msg = "unindent does not match any outer indentation level";
        break;
    case E_TOODEEP:
        errtype = PyExc_IndentationError;
        msg = "too many levels of indentation";
        break;
    case E_DECODE: {
        PyObject *type, *value, *tb;
        PyErr_Fetch(&type, &value, &tb);
        msg = "unknown decode error";
        if (value != NULL)
            msg_obj = PyObject_Str(value);
        Py_XDECREF(type);
        Py_XDECREF(value);
        Py_XDECREF(tb);
        break;
    }
    case E_LINECONT:
        msg = "unexpected character after line continuation character";
        break;
    default:
        fprintf(stderr, "error=%d\n", err->error);
        msg = "unknown parsing error";
        break;
    }
    /* err->text may not be UTF-8 in case of decoding errors.
       Explicitly convert to an object. */
    if (!err->text) {
        errtext = Py_None;
        Py_INCREF(Py_None);
    } else {
        errtext = PyUnicode_DecodeUTF8(err->text, err->offset,
                                       "replace");
        if (errtext != NULL) {
            Py_ssize_t len = strlen(err->text);
            offset = (int)PyUnicode_GET_LENGTH(errtext);
            if (len != err->offset) {
                Py_DECREF(errtext);
                errtext = PyUnicode_DecodeUTF8(err->text, len,
                                               "replace");
            }
        }
    }
    v = Py_BuildValue("(OiiN)", err->filename,
                      err->lineno, offset, errtext);
    if (v != NULL) {
        if (msg_obj)
            w = Py_BuildValue("(OO)", msg_obj, v);
        else
            w = Py_BuildValue("(sO)", msg, v);
    } else
        w = NULL;
    Py_XDECREF(v);
    PyErr_SetObject(errtype, w);
    Py_XDECREF(w);
cleanup:
    Py_XDECREF(msg_obj);
    if (err->text != NULL) {
        PyObject_FREE(err->text);
        err->text = NULL;
    }
}

// from Python/pythonrun.c
static void
err_free(perrdetail *err)
{
    Py_CLEAR(err->filename);
}

// copy of PyParser_ASTFromStringObject in Python/pythonrun.c
/* Preferred access to parser is through AST. */
static mod_ty
string_object_to_c_ast(const char *s, PyObject *filename, int start,
                             PyCompilerFlags *flags, PyArena *arena)
{
    mod_ty mod;
    PyCompilerFlags localflags;
    perrdetail err;
    int iflags = PARSER_FLAGS(flags);

    node *n = Ta27Parser_ParseStringObject(s, filename,
                                         &_Ta27Parser_Grammar, start, &err,
                                         &iflags);
    if (flags == NULL) {
        localflags.cf_flags = 0;
        flags = &localflags;
    }
    if (n) {
        flags->cf_flags |= iflags & PyCF_MASK;
        mod = Ta27AST_FromNode(n, flags, PyUnicode_AsUTF8(filename), arena);
        Ta27Node_Free(n);
    }
    else {
        err_input(&err);
        mod = NULL;
    }
    err_free(&err);
    return mod;
}

// adapted from Py_CompileStringObject in Python/pythonrun.c
static PyObject *
string_object_to_py_ast(const char *str, PyObject *filename, int start,
                       PyCompilerFlags *flags)
{
    mod_ty mod;
    PyObject *result;
    PyArena *arena = PyArena_New();
    if (arena == NULL)
        return NULL;

    mod = string_object_to_c_ast(str, filename, start, flags, arena);
    if (mod == NULL) {
        PyArena_Free(arena);
        return NULL;
    }

    result = Ta27AST_mod2obj(mod);
    PyArena_Free(arena);
    return result;
}

// adapted from builtin_compile_impl in Python/bltinmodule.c
static PyObject *
ast27_parse_impl(PyObject *source,
                 PyObject *filename, const char *mode)
{
    PyObject *source_copy;
    const char *str;
    int compile_mode = -1;
    PyCompilerFlags cf;
    int start[] = {file_input, eval_input, single_input, func_type_input };
    PyObject *result;

    cf.cf_flags = PyCF_ONLY_AST | PyCF_SOURCE_IS_UTF8;

    if (strcmp(mode, "exec") == 0)
        compile_mode = 0;
    else if (strcmp(mode, "eval") == 0)
        compile_mode = 1;
    else if (strcmp(mode, "single") == 0)
        compile_mode = 2;
    else if (strcmp(mode, "func_type") == 0)
        compile_mode = 3;
    else {
        PyErr_SetString(PyExc_ValueError,
                        "parse() mode must be 'exec', 'eval', 'single', or 'func_type'");
        goto error;
    }

    str = source_as_string(source, "parse", "string or bytes", &cf, &source_copy);
    if (str == NULL)
        goto error;

    result = string_object_to_py_ast(str, filename, start[compile_mode], &cf);
    Py_XDECREF(source_copy);
    goto finally;

error:
    result = NULL;
finally:
    Py_DECREF(filename);
    return result;
}

// adapted from builtin_compile in Python/clinic/bltinmodule.c.h
PyObject *
ast27_parse(PyObject *self, PyObject *args)
{
    PyObject *return_value = NULL;
    PyObject *source;
    PyObject *filename;
    const char *mode;

    if (PyArg_ParseTuple(args, "OO&s:parse", &source, PyUnicode_FSDecoder, &filename, &mode))
        return_value = ast27_parse_impl(source, filename, mode);

    return return_value;
}
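The `parse()` entry point above maps a mode string ("exec", "eval", "single", "func_type") onto a grammar start symbol via the `start[]` table in `ast27_parse_impl`. The same dispatch can be exercised from Python with the stdlib `ast` module, which shares the first three modes (typed_ast's `ast27.parse` additionally accepts "func_type"); a minimal sketch, using stdlib `ast` as a stand-in:

```python
import ast

# Each mode string selects a different start symbol, mirroring the
# compile_mode -> start[] lookup in ast27_parse_impl above.
samples = {
    "exec": "x = 1",      # file_input: a module body
    "eval": "1 + 2",      # eval_input: a single expression
    "single": "x = 1\n",  # single_input: one interactive statement
}

for mode, src in samples.items():
    tree = ast.parse(src, mode=mode)
    print(mode, "->", type(tree).__name__)
# exec -> Module, eval -> Expression, single -> Interactive
```

An unknown mode raises `ValueError`, just as the C implementation does after its `strcmp` chain falls through to the `else` branch.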


================================================
FILE: ast27/Grammar/Grammar
================================================
# Grammar for Python

# Note:  Changing the grammar specified in this file will most likely
#        require corresponding changes in the parser module
#        (../Modules/parsermodule.c).  If you can't make the changes to
#        that module yourself, please co-ordinate the required changes
#        with someone who can; ask around on python-dev for help.  Fred
#        Drake <fdrake@acm.org> will probably be listening there.

# NOTE WELL: You should also follow all the steps listed in PEP 306,
# "How to Change Python's Grammar"

# Start symbols for the grammar:
#       single_input is a single interactive statement;
#       file_input is a module or sequence of commands read from an input file;
#       eval_input is the input for the eval() and input() functions.
#       func_type_input is a PEP 484 Python 2 function type comment
# NB: compound_stmt in single_input is followed by extra NEWLINE!
single_input: NEWLINE | simple_stmt | compound_stmt NEWLINE
file_input: (NEWLINE | stmt)* ENDMARKER
eval_input: testlist NEWLINE* ENDMARKER

decorator: '@' dotted_name [ '(' [arglist] ')' ] NEWLINE
decorators: decorator+
decorated: decorators (classdef | funcdef)
funcdef: 'def' NAME parameters ':' [TYPE_COMMENT] suite
parameters: '(' [varargslist] ')'
varargslist: ((fpdef ['=' test] ',' [TYPE_COMMENT])*
              ('*' NAME [',' [TYPE_COMMENT]  '**' NAME] [TYPE_COMMENT] | '**' NAME [TYPE_COMMENT]) |
              fpdef ['=' test] (',' [TYPE_COMMENT] fpdef ['=' test])* [','] [TYPE_COMMENT])
fpdef: NAME | '(' fplist ')'
fplist: fpdef (',' fpdef)* [',']

stmt: simple_stmt | compound_stmt
simple_stmt: small_stmt (';' small_stmt)* [';'] NEWLINE
small_stmt: (expr_stmt | print_stmt  | del_stmt | pass_stmt | flow_stmt |
             import_stmt | global_stmt | exec_stmt | assert_stmt)
expr_stmt: testlist (augassign (yield_expr|testlist) |
                     ('=' (yield_expr|testlist))* [TYPE_COMMENT])
augassign: ('+=' | '-=' | '*=' | '/=' | '%=' | '&=' | '|=' | '^=' |
            '<<=' | '>>=' | '**=' | '//=')
# For normal assignments, additional restrictions enforced by the interpreter
print_stmt: 'print' ( [ test (',' test)* [','] ] |
                      '>>' test [ (',' test)+ [','] ] )
del_stmt: 'del' exprlist
pass_stmt: 'pass'
flow_stmt: break_stmt | continue_stmt | return_stmt | raise_stmt | yield_stmt
break_stmt: 'break'
continue_stmt: 'continue'
return_stmt: 'return' [testlist]
yield_stmt: yield_expr
raise_stmt: 'raise' [test [',' test [',' test]]]
import_stmt: import_name | import_from
import_name: 'import' dotted_as_names
import_from: ('from' ('.'* dotted_name | '.'+)
              'import' ('*' | '(' import_as_names ')' | import_as_names))
import_as_name: NAME ['as' NAME]
dotted_as_name: dotted_name ['as' NAME]
import_as_names: import_as_name (',' import_as_name)* [',']
dotted_as_names: dotted_as_name (',' dotted_as_name)*
dotted_name: NAME ('.' NAME)*
global_stmt: 'global' NAME (',' NAME)*
exec_stmt: 'exec' expr ['in' test [',' test]]
assert_stmt: 'assert' test [',' test]

compound_stmt: if_stmt | while_stmt | for_stmt | try_stmt | with_stmt | funcdef | classdef | decorated
if_stmt: 'if' test ':' suite ('elif' test ':' suite)* ['else' ':' suite]
while_stmt: 'while' test ':' suite ['else' ':' suite]
for_stmt: 'for' exprlist 'in' testlist ':' [TYPE_COMMENT] suite ['else' ':' suite]
try_stmt: ('try' ':' suite
           ((except_clause ':' suite)+
            ['else' ':' suite]
            ['finally' ':' suite] |
           'finally' ':' suite))
with_stmt: 'with' with_item (',' with_item)*  ':' [TYPE_COMMENT] suite
with_item: test ['as' expr]
# NB compile.c makes sure that the default except clause is last
except_clause: 'except' [test [('as' | ',') test]]
# the TYPE_COMMENT in suites is only parsed for funcdefs, but can't go elsewhere due to ambiguity
suite: simple_stmt | NEWLINE [TYPE_COMMENT NEWLINE] INDENT stmt+ DEDENT

# Backward compatibility cruft to support:
# [ x for x in lambda: True, lambda: False if x() ]
# even while also allowing:
# lambda x: 5 if x else 2
# (But not a mix of the two)
testlist_safe: old_test [(',' old_test)+ [',']]
old_test: or_test | old_lambdef
old_lambdef: 'lambda' [varargslist] ':' old_test

test: or_test ['if' or_test 'else' test] | lambdef
or_test: and_test ('or' and_test)*
and_test: not_test ('and' not_test)*
not_test: 'not' not_test | comparison
comparison: expr (comp_op expr)*
comp_op: '<'|'>'|'=='|'>='|'<='|'<>'|'!='|'in'|'not' 'in'|'is'|'is' 'not'
expr: xor_expr ('|' xor_expr)*
xor_expr: and_expr ('^' and_expr)*
and_expr: shift_expr ('&' shift_expr)*
shift_expr: arith_expr (('<<'|'>>') arith_expr)*
arith_expr: term (('+'|'-') term)*
term: factor (('*'|'/'|'%'|'//') factor)*
factor: ('+'|'-'|'~') factor | power
power: atom trailer* ['**' factor]
atom: ('(' [yield_expr|testlist_comp] ')' |
       '[' [listmaker] ']' |
       '{' [dictorsetmaker] '}' |
       '`' testlist1 '`' |
       NAME | NUMBER | STRING+)
listmaker: test ( list_for | (',' test)* [','] )
testlist_comp: test ( comp_for | (',' test)* [','] )
lambdef: 'lambda' [varargslist] ':' test
trailer: '(' [arglist] ')' | '[' subscriptlist ']' | '.' NAME
subscriptlist: subscript (',' subscript)* [',']
subscript: '.' '.' '.' | test | [test] ':' [test] [sliceop]
sliceop: ':' [test]
exprlist: expr (',' expr)* [',']
testlist: test (',' test)* [',']
dictorsetmaker: ( (test ':' test (comp_for | (',' test ':' test)* [','])) |
                  (test (comp_for | (',' test)* [','])) )

classdef: 'class' NAME ['(' [testlist] ')'] ':' suite

arglist: (argument ',')* (argument [',']
                         |'*' test (',' argument)* [',' '**' test] 
                         |'**' test)
# The reason that keywords are test nodes instead of NAME is that using NAME
# results in an ambiguity. ast.c makes sure it's a NAME.
argument: test [comp_for] | test '=' test

list_iter: list_for | list_if
list_for: 'for' exprlist 'in' testlist_safe [list_iter]
list_if: 'if' old_test [list_iter]

comp_iter: comp_for | comp_if
comp_for: 'for' exprlist 'in' or_test [comp_iter]
comp_if: 'if' old_test [comp_iter]

testlist1: test (',' test)*

# not used in grammar, but may appear in "node" passed from Parser to Compiler
encoding_decl: NAME

yield_expr: 'yield' [testlist]

func_type_input: func_type NEWLINE* ENDMARKER
func_type: '(' [typelist] ')' '->' test
# typelist is a modified typedargslist (see above)
typelist: (test (',' test)* [','
       ['*' [test] (',' test)* [',' '**' test] | '**' test]]
     |  '*' [test] (',' test)* [',' '**' test] | '**' test)
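The `func_type_input` start symbol above parses the body of a PEP 484 Python 2 function type comment such as `(int, str) -> bool`. Stdlib `ast` in Python 3.8+ grew an equivalent `mode='func_type'`, so the rule can be exercised without typed_ast; a sketch using that stand-in:

```python
import ast

# func_type: '(' [typelist] ')' '->' test  yields a FunctionType node
# whose argtypes come from typelist and whose returns comes from test.
tree = ast.parse("(int, str) -> bool", mode="func_type")

assert isinstance(tree, ast.FunctionType)
print([t.id for t in tree.argtypes])  # ['int', 'str']
print(tree.returns.id)                # bool
```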


================================================
FILE: ast27/Include/Python-ast.h
================================================
/* File automatically generated by Parser/asdl_c.py. */

#include "../Include/asdl.h"

typedef struct _mod *mod_ty;

typedef struct _stmt *stmt_ty;

typedef struct _expr *expr_ty;

typedef enum _expr_context { Load=1, Store=2, Del=3, AugLoad=4, AugStore=5, Param=6 }
                             expr_context_ty;

typedef struct _slice *slice_ty;

typedef enum _boolop { And=1, Or=2 } boolop_ty;

typedef enum _operator { Add=1, Sub=2, Mult=3, Div=4, Mod=5, Pow=6, LShift=7, RShift=8, BitOr=9,
                         BitXor=10, BitAnd=11, FloorDiv=12 } operator_ty;

typedef enum _unaryop { Invert=1, Not=2, UAdd=3, USub=4 } unaryop_ty;

typedef enum _cmpop { Eq=1, NotEq=2, Lt=3, LtE=4, Gt=5, GtE=6, Is=7, IsNot=8, In=9, NotIn=10 }
                      cmpop_ty;

typedef struct _comprehension *comprehension_ty;

typedef struct _excepthandler *excepthandler_ty;

typedef struct _arguments *arguments_ty;

typedef struct _keyword *keyword_ty;

typedef struct _alias *alias_ty;

typedef struct _type_ignore *type_ignore_ty;


enum _mod_kind {Module_kind=1, Interactive_kind=2, Expression_kind=3, FunctionType_kind=4,
                 Suite_kind=5};
struct _mod {
        enum _mod_kind kind;
        union {
                struct {
                        asdl_seq *body;
                        asdl_seq *type_ignores;
                } Module;
                
                struct {
                        asdl_seq *body;
                } Interactive;
                
                struct {
                        expr_ty body;
                } Expression;
                
                struct {
                        asdl_seq *argtypes;
                        expr_ty returns;
                } FunctionType;
                
                struct {
                        asdl_seq *body;
                } Suite;
                
        } v;
};

enum _stmt_kind {FunctionDef_kind=1, ClassDef_kind=2, Return_kind=3, Delete_kind=4, Assign_kind=5,
                  AugAssign_kind=6, Print_kind=7, For_kind=8, While_kind=9, If_kind=10,
                  With_kind=11, Raise_kind=12, TryExcept_kind=13, TryFinally_kind=14,
                  Assert_kind=15, Import_kind=16, ImportFrom_kind=17, Exec_kind=18, Global_kind=19,
                  Expr_kind=20, Pass_kind=21, Break_kind=22, Continue_kind=23};
struct _stmt {
        enum _stmt_kind kind;
        union {
                struct {
                        identifier name;
                        arguments_ty args;
                        asdl_seq *body;
                        asdl_seq *decorator_list;
                        string type_comment;
                } FunctionDef;
                
                struct {
                        identifier name;
                        asdl_seq *bases;
                        asdl_seq *body;
                        asdl_seq *decorator_list;
                } ClassDef;
                
                struct {
                        expr_ty value;
                } Return;
                
                struct {
                        asdl_seq *targets;
                } Delete;
                
                struct {
                        asdl_seq *targets;
                        expr_ty value;
                        string type_comment;
                } Assign;
                
                struct {
                        expr_ty target;
                        operator_ty op;
                        expr_ty value;
                } AugAssign;
                
                struct {
                        expr_ty dest;
                        asdl_seq *values;
                        bool nl;
                } Print;
                
                struct {
                        expr_ty target;
                        expr_ty iter;
                        asdl_seq *body;
                        asdl_seq *orelse;
                        string type_comment;
                } For;
                
                struct {
                        expr_ty test;
                        asdl_seq *body;
                        asdl_seq *orelse;
                } While;
                
                struct {
                        expr_ty test;
                        asdl_seq *body;
                        asdl_seq *orelse;
                } If;
                
                struct {
                        expr_ty context_expr;
                        expr_ty optional_vars;
                        asdl_seq *body;
                        string type_comment;
                } With;
                
                struct {
                        expr_ty type;
                        expr_ty inst;
                        expr_ty tback;
                } Raise;
                
                struct {
                        asdl_seq *body;
                        asdl_seq *handlers;
                        asdl_seq *orelse;
                } TryExcept;
                
                struct {
                        asdl_seq *body;
                        asdl_seq *finalbody;
                } TryFinally;
                
                struct {
                        expr_ty test;
                        expr_ty msg;
                } Assert;
                
                struct {
                        asdl_seq *names;
                } Import;
                
                struct {
                        identifier module;
                        asdl_seq *names;
                        int level;
                } ImportFrom;
                
                struct {
                        expr_ty body;
                        expr_ty globals;
                        expr_ty locals;
                } Exec;
                
                struct {
                        asdl_seq *names;
                } Global;
                
                struct {
                        expr_ty value;
                } Expr;
                
        } v;
        int lineno;
        int col_offset;
};

enum _expr_kind {BoolOp_kind=1, BinOp_kind=2, UnaryOp_kind=3, Lambda_kind=4, IfExp_kind=5,
                  Dict_kind=6, Set_kind=7, ListComp_kind=8, SetComp_kind=9, DictComp_kind=10,
                  GeneratorExp_kind=11, Yield_kind=12, Compare_kind=13, Call_kind=14, Repr_kind=15,
                  Num_kind=16, Str_kind=17, Attribute_kind=18, Subscript_kind=19, Name_kind=20,
                  List_kind=21, Tuple_kind=22};
struct _expr {
        enum _expr_kind kind;
        union {
                struct {
                        boolop_ty op;
                        asdl_seq *values;
                } BoolOp;
                
                struct {
                        expr_ty left;
                        operator_ty op;
                        expr_ty right;
                } BinOp;
                
                struct {
                        unaryop_ty op;
                        expr_ty operand;
                } UnaryOp;
                
                struct {
                        arguments_ty args;
                        expr_ty body;
                } Lambda;
                
                struct {
                        expr_ty test;
                        expr_ty body;
                        expr_ty orelse;
                } IfExp;
                
                struct {
                        asdl_seq *keys;
                        asdl_seq *values;
                } Dict;
                
                struct {
                        asdl_seq *elts;
                } Set;
                
                struct {
                        expr_ty elt;
                        asdl_seq *generators;
                } ListComp;
                
                struct {
                        expr_ty elt;
                        asdl_seq *generators;
                } SetComp;
                
                struct {
                        expr_ty key;
                        expr_ty value;
                        asdl_seq *generators;
                } DictComp;
                
                struct {
                        expr_ty elt;
                        asdl_seq *generators;
                } GeneratorExp;
                
                struct {
                        expr_ty value;
                } Yield;
                
                struct {
                        expr_ty left;
                        asdl_int_seq *ops;
                        asdl_seq *comparators;
                } Compare;
                
                struct {
                        expr_ty func;
                        asdl_seq *args;
                        asdl_seq *keywords;
                        expr_ty starargs;
                        expr_ty kwargs;
                } Call;
                
                struct {
                        expr_ty value;
                } Repr;
                
                struct {
                        object n;
                } Num;
                
                struct {
                        string s;
                        string kind;
                } Str;
                
                struct {
                        expr_ty value;
                        identifier attr;
                        expr_context_ty ctx;
                } Attribute;
                
                struct {
                        expr_ty value;
                        slice_ty slice;
                        expr_context_ty ctx;
                } Subscript;
                
                struct {
                        identifier id;
                        expr_context_ty ctx;
                } Name;
                
                struct {
                        asdl_seq *elts;
                        expr_context_ty ctx;
                } List;
                
                struct {
                        asdl_seq *elts;
                        expr_context_ty ctx;
                } Tuple;
                
        } v;
        int lineno;
        int col_offset;
};

enum _slice_kind {Ellipsis_kind=1, Slice_kind=2, ExtSlice_kind=3, Index_kind=4};
struct _slice {
        enum _slice_kind kind;
        union {
                struct {
                        expr_ty lower;
                        expr_ty upper;
                        expr_ty step;
                } Slice;
                
                struct {
                        asdl_seq *dims;
                } ExtSlice;
                
                struct {
                        expr_ty value;
                } Index;
                
        } v;
};

struct _comprehension {
        expr_ty target;
        expr_ty iter;
        asdl_seq *ifs;
};

enum _excepthandler_kind {ExceptHandler_kind=1};
struct _excepthandler {
        enum _excepthandler_kind kind;
        union {
                struct {
                        expr_ty type;
                        expr_ty name;
                        asdl_seq *body;
                } ExceptHandler;
                
        } v;
        int lineno;
        int col_offset;
};

struct _arguments {
        asdl_seq *args;
        identifier vararg;
        identifier kwarg;
        asdl_seq *defaults;
        asdl_seq *type_comments;
};

struct _keyword {
        identifier arg;
        expr_ty value;
};

struct _alias {
        identifier name;
        identifier asname;
};

enum _type_ignore_kind {TypeIgnore_kind=1};
struct _type_ignore {
        enum _type_ignore_kind kind;
        union {
                struct {
                        int lineno;
                        string tag;
                } TypeIgnore;
                
        } v;
};


#define Module(a0, a1, a2) _Ta27_Module(a0, a1, a2)
mod_ty _Ta27_Module(asdl_seq * body, asdl_seq * type_ignores, PyArena *arena);
#define Interactive(a0, a1) _Ta27_Interactive(a0, a1)
mod_ty _Ta27_Interactive(asdl_seq * body, PyArena *arena);
#define Expression(a0, a1) _Ta27_Expression(a0, a1)
mod_ty _Ta27_Expression(expr_ty body, PyArena *arena);
#define FunctionType(a0, a1, a2) _Ta27_FunctionType(a0, a1, a2)
mod_ty _Ta27_FunctionType(asdl_seq * argtypes, expr_ty returns, PyArena *arena);
#define Suite(a0, a1) _Ta27_Suite(a0, a1)
mod_ty _Ta27_Suite(asdl_seq * body, PyArena *arena);
#define FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7) _Ta27_FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7)
stmt_ty _Ta27_FunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq *
                          decorator_list, string type_comment, int lineno, int col_offset, PyArena
                          *arena);
#define ClassDef(a0, a1, a2, a3, a4, a5, a6) _Ta27_ClassDef(a0, a1, a2, a3, a4, a5, a6)
stmt_ty _Ta27_ClassDef(identifier name, asdl_seq * bases, asdl_seq * body, asdl_seq *
                       decorator_list, int lineno, int col_offset, PyArena *arena);
#define Return(a0, a1, a2, a3) _Ta27_Return(a0, a1, a2, a3)
stmt_ty _Ta27_Return(expr_ty value, int lineno, int col_offset, PyArena *arena);
#define Delete(a0, a1, a2, a3) _Ta27_Delete(a0, a1, a2, a3)
stmt_ty _Ta27_Delete(asdl_seq * targets, int lineno, int col_offset, PyArena *arena);
#define Assign(a0, a1, a2, a3, a4, a5) _Ta27_Assign(a0, a1, a2, a3, a4, a5)
stmt_ty _Ta27_Assign(asdl_seq * targets, expr_ty value, string type_comment, int lineno, int
                     col_offset, PyArena *arena);
#define AugAssign(a0, a1, a2, a3, a4, a5) _Ta27_AugAssign(a0, a1, a2, a3, a4, a5)
stmt_ty _Ta27_AugAssign(expr_ty target, operator_ty op, expr_ty value, int lineno, int col_offset,
                        PyArena *arena);
#define Print(a0, a1, a2, a3, a4, a5) _Ta27_Print(a0, a1, a2, a3, a4, a5)
stmt_ty _Ta27_Print(expr_ty dest, asdl_seq * values, bool nl, int lineno, int col_offset, PyArena
                    *arena);
#define For(a0, a1, a2, a3, a4, a5, a6, a7) _Ta27_For(a0, a1, a2, a3, a4, a5, a6, a7)
stmt_ty _Ta27_For(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, string
                  type_comment, int lineno, int col_offset, PyArena *arena);
#define While(a0, a1, a2, a3, a4, a5) _Ta27_While(a0, a1, a2, a3, a4, a5)
stmt_ty _Ta27_While(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, int col_offset,
                    PyArena *arena);
#define If(a0, a1, a2, a3, a4, a5) _Ta27_If(a0, a1, a2, a3, a4, a5)
stmt_ty _Ta27_If(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, int col_offset,
                 PyArena *arena);
#define With(a0, a1, a2, a3, a4, a5, a6) _Ta27_With(a0, a1, a2, a3, a4, a5, a6)
stmt_ty _Ta27_With(expr_ty context_expr, expr_ty optional_vars, asdl_seq * body, string
                   type_comment, int lineno, int col_offset, PyArena *arena);
#define Raise(a0, a1, a2, a3, a4, a5) _Ta27_Raise(a0, a1, a2, a3, a4, a5)
stmt_ty _Ta27_Raise(expr_ty type, expr_ty inst, expr_ty tback, int lineno, int col_offset, PyArena
                    *arena);
#define TryExcept(a0, a1, a2, a3, a4, a5) _Ta27_TryExcept(a0, a1, a2, a3, a4, a5)
stmt_ty _Ta27_TryExcept(asdl_seq * body, asdl_seq * handlers, asdl_seq * orelse, int lineno, int
                        col_offset, PyArena *arena);
#define TryFinally(a0, a1, a2, a3, a4) _Ta27_TryFinally(a0, a1, a2, a3, a4)
stmt_ty _Ta27_TryFinally(asdl_seq * body, asdl_seq * finalbody, int lineno, int col_offset, PyArena
                         *arena);
#define Assert(a0, a1, a2, a3, a4) _Ta27_Assert(a0, a1, a2, a3, a4)
stmt_ty _Ta27_Assert(expr_ty test, expr_ty msg, int lineno, int col_offset, PyArena *arena);
#define Import(a0, a1, a2, a3) _Ta27_Import(a0, a1, a2, a3)
stmt_ty _Ta27_Import(asdl_seq * names, int lineno, int col_offset, PyArena *arena);
#define ImportFrom(a0, a1, a2, a3, a4, a5) _Ta27_ImportFrom(a0, a1, a2, a3, a4, a5)
stmt_ty _Ta27_ImportFrom(identifier module, asdl_seq * names, int level, int lineno, int
                         col_offset, PyArena *arena);
#define Exec(a0, a1, a2, a3, a4, a5) _Ta27_Exec(a0, a1, a2, a3, a4, a5)
stmt_ty _Ta27_Exec(expr_ty body, expr_ty globals, expr_ty locals, int lineno, int col_offset,
                   PyArena *arena);
#define Global(a0, a1, a2, a3) _Ta27_Global(a0, a1, a2, a3)
stmt_ty _Ta27_Global(asdl_seq * names, int lineno, int col_offset, PyArena *arena);
#define Expr(a0, a1, a2, a3) _Ta27_Expr(a0, a1, a2, a3)
stmt_ty _Ta27_Expr(expr_ty value, int lineno, int col_offset, PyArena *arena);
#define Pass(a0, a1, a2) _Ta27_Pass(a0, a1, a2)
stmt_ty _Ta27_Pass(int lineno, int col_offset, PyArena *arena);
#define Break(a0, a1, a2) _Ta27_Break(a0, a1, a2)
stmt_ty _Ta27_Break(int lineno, int col_offset, PyArena *arena);
#define Continue(a0, a1, a2) _Ta27_Continue(a0, a1, a2)
stmt_ty _Ta27_Continue(int lineno, int col_offset, PyArena *arena);
#define BoolOp(a0, a1, a2, a3, a4) _Ta27_BoolOp(a0, a1, a2, a3, a4)
expr_ty _Ta27_BoolOp(boolop_ty op, asdl_seq * values, int lineno, int col_offset, PyArena *arena);
#define BinOp(a0, a1, a2, a3, a4, a5) _Ta27_BinOp(a0, a1, a2, a3, a4, a5)
expr_ty _Ta27_BinOp(expr_ty left, operator_ty op, expr_ty right, int lineno, int col_offset,
                    PyArena *arena);
#define UnaryOp(a0, a1, a2, a3, a4) _Ta27_UnaryOp(a0, a1, a2, a3, a4)
expr_ty _Ta27_UnaryOp(unaryop_ty op, expr_ty operand, int lineno, int col_offset, PyArena *arena);
#define Lambda(a0, a1, a2, a3, a4) _Ta27_Lambda(a0, a1, a2, a3, a4)
expr_ty _Ta27_Lambda(arguments_ty args, expr_ty body, int lineno, int col_offset, PyArena *arena);
#define IfExp(a0, a1, a2, a3, a4, a5) _Ta27_IfExp(a0, a1, a2, a3, a4, a5)
expr_ty _Ta27_IfExp(expr_ty test, expr_ty body, expr_ty orelse, int lineno, int col_offset, PyArena
                    *arena);
#define Dict(a0, a1, a2, a3, a4) _Ta27_Dict(a0, a1, a2, a3, a4)
expr_ty _Ta27_Dict(asdl_seq * keys, asdl_seq * values, int lineno, int col_offset, PyArena *arena);
#define Set(a0, a1, a2, a3) _Ta27_Set(a0, a1, a2, a3)
expr_ty _Ta27_Set(asdl_seq * elts, int lineno, int col_offset, PyArena *arena);
#define ListComp(a0, a1, a2, a3, a4) _Ta27_ListComp(a0, a1, a2, a3, a4)
expr_ty _Ta27_ListComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, PyArena
                       *arena);
#define SetComp(a0, a1, a2, a3, a4) _Ta27_SetComp(a0, a1, a2, a3, a4)
expr_ty _Ta27_SetComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, PyArena
                      *arena);
#define DictComp(a0, a1, a2, a3, a4, a5) _Ta27_DictComp(a0, a1, a2, a3, a4, a5)
expr_ty _Ta27_DictComp(expr_ty key, expr_ty value, asdl_seq * generators, int lineno, int
                       col_offset, PyArena *arena);
#define GeneratorExp(a0, a1, a2, a3, a4) _Ta27_GeneratorExp(a0, a1, a2, a3, a4)
expr_ty _Ta27_GeneratorExp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, PyArena
                           *arena);
#define Yield(a0, a1, a2, a3) _Ta27_Yield(a0, a1, a2, a3)
expr_ty _Ta27_Yield(expr_ty value, int lineno, int col_offset, PyArena *arena);
#define Compare(a0, a1, a2, a3, a4, a5) _Ta27_Compare(a0, a1, a2, a3, a4, a5)
expr_ty _Ta27_Compare(expr_ty left, asdl_int_seq * ops, asdl_seq * comparators, int lineno, int
                      col_offset, PyArena *arena);
#define Call(a0, a1, a2, a3, a4, a5, a6, a7) _Ta27_Call(a0, a1, a2, a3, a4, a5, a6, a7)
expr_ty _Ta27_Call(expr_ty func, asdl_seq * args, asdl_seq * keywords, expr_ty starargs, expr_ty
                   kwargs, int lineno, int col_offset, PyArena *arena);
#define Repr(a0, a1, a2, a3) _Ta27_Repr(a0, a1, a2, a3)
expr_ty _Ta27_Repr(expr_ty value, int lineno, int col_offset, PyArena *arena);
#define Num(a0, a1, a2, a3) _Ta27_Num(a0, a1, a2, a3)
expr_ty _Ta27_Num(object n, int lineno, int col_offset, PyArena *arena);
#define Str(a0, a1, a2, a3, a4) _Ta27_Str(a0, a1, a2, a3, a4)
expr_ty _Ta27_Str(string s, string kind, int lineno, int col_offset, PyArena *arena);
#define Attribute(a0, a1, a2, a3, a4, a5) _Ta27_Attribute(a0, a1, a2, a3, a4, a5)
expr_ty _Ta27_Attribute(expr_ty value, identifier attr, expr_context_ty ctx, int lineno, int
                        col_offset, PyArena *arena);
#define Subscript(a0, a1, a2, a3, a4, a5) _Ta27_Subscript(a0, a1, a2, a3, a4, a5)
expr_ty _Ta27_Subscript(expr_ty value, slice_ty slice, expr_context_ty ctx, int lineno, int
                        col_offset, PyArena *arena);
#define Name(a0, a1, a2, a3, a4) _Ta27_Name(a0, a1, a2, a3, a4)
expr_ty _Ta27_Name(identifier id, expr_context_ty ctx, int lineno, int col_offset, PyArena *arena);
#define List(a0, a1, a2, a3, a4) _Ta27_List(a0, a1, a2, a3, a4)
expr_ty _Ta27_List(asdl_seq * elts, expr_context_ty ctx, int lineno, int col_offset, PyArena
                   *arena);
#define Tuple(a0, a1, a2, a3, a4) _Ta27_Tuple(a0, a1, a2, a3, a4)
expr_ty _Ta27_Tuple(asdl_seq * elts, expr_context_ty ctx, int lineno, int col_offset, PyArena
                    *arena);
#define Ellipsis(a0) _Ta27_Ellipsis(a0)
slice_ty _Ta27_Ellipsis(PyArena *arena);
#define Slice(a0, a1, a2, a3) _Ta27_Slice(a0, a1, a2, a3)
slice_ty _Ta27_Slice(expr_ty lower, expr_ty upper, expr_ty step, PyArena *arena);
#define ExtSlice(a0, a1) _Ta27_ExtSlice(a0, a1)
slice_ty _Ta27_ExtSlice(asdl_seq * dims, PyArena *arena);
#define Index(a0, a1) _Ta27_Index(a0, a1)
slice_ty _Ta27_Index(expr_ty value, PyArena *arena);
#define comprehension(a0, a1, a2, a3) _Ta27_comprehension(a0, a1, a2, a3)
comprehension_ty _Ta27_comprehension(expr_ty target, expr_ty iter, asdl_seq * ifs, PyArena *arena);
#define ExceptHandler(a0, a1, a2, a3, a4, a5) _Ta27_ExceptHandler(a0, a1, a2, a3, a4, a5)
excepthandler_ty _Ta27_ExceptHandler(expr_ty type, expr_ty name, asdl_seq * body, int lineno, int
                                     col_offset, PyArena *arena);
#define arguments(a0, a1, a2, a3, a4, a5) _Ta27_arguments(a0, a1, a2, a3, a4, a5)
arguments_ty _Ta27_arguments(asdl_seq * args, identifier vararg, identifier kwarg, asdl_seq *
                             defaults, asdl_seq * type_comments, PyArena *arena);
#define keyword(a0, a1, a2) _Ta27_keyword(a0, a1, a2)
keyword_ty _Ta27_keyword(identifier arg, expr_ty value, PyArena *arena);
#define alias(a0, a1, a2) _Ta27_alias(a0, a1, a2)
alias_ty _Ta27_alias(identifier name, identifier asname, PyArena *arena);
#define TypeIgnore(a0, a1, a2) _Ta27_TypeIgnore(a0, a1, a2)
type_ignore_ty _Ta27_TypeIgnore(int lineno, string tag, PyArena *arena);

PyObject* Ta27AST_mod2obj(mod_ty t);
mod_ty Ta27AST_obj2mod(PyObject* ast, PyArena* arena, int mode);
int Ta27AST_Check(PyObject* obj);


================================================
FILE: ast27/Include/asdl.h
================================================
#ifndef Ta27_ASDL_H
#define Ta27_ASDL_H

#include "../Include/pyarena.h"

typedef PyObject * identifier;
typedef PyObject * string;
typedef PyObject * object;

#ifndef __cplusplus
#ifndef __bool_true_false_are_defined
typedef enum {false, true} bool;
#endif
#endif

/* It would be nice if the code generated by asdl_c.py was completely
   independent of Python, but it is a goal that requires too much work
   at this stage.  So, for example, I'll represent identifiers as
   interned Python strings.
*/

/* XXX A sequence should be typed so that its use can be typechecked. */

typedef struct {
    int size;
    void *elements[1];
} asdl_seq;

typedef struct {
    int size;
    int elements[1];
} asdl_int_seq;

#define asdl_seq_new _Ta27_asdl_seq_new
#define asdl_int_seq_new _Ta27_asdl_int_seq_new
asdl_seq *asdl_seq_new(Py_ssize_t size, PyArena *arena);
asdl_int_seq *asdl_int_seq_new(Py_ssize_t size, PyArena *arena);

#define asdl_seq_GET(S, I) (S)->elements[(I)]
#define asdl_seq_LEN(S) ((S) == NULL ? 0 : (S)->size)
#ifdef Py_DEBUG
#define asdl_seq_SET(S, I, V) { \
        int _asdl_i = (I); \
        assert((S) && _asdl_i < (S)->size); \
        (S)->elements[_asdl_i] = (V); \
}
#else
#define asdl_seq_SET(S, I, V) (S)->elements[I] = (V)
#endif

#endif /* !Ta27_ASDL_H */
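The asdl_seq contract above (size fixed at creation, a NULL sequence treated as length 0, a bounds assertion in asdl_seq_SET only under Py_DEBUG) can be sketched in Python. The names mirror the C macros, but this is an illustration, not the real API:

```python
# Illustrative Python model of the asdl_seq accessors; not the real C API.

class AsdlSeq:
    """Fixed-size sequence, like the C struct with its trailing element array."""
    def __init__(self, size):
        self.size = size
        self.elements = [None] * size

def asdl_seq_new(size):
    return AsdlSeq(size)

def asdl_seq_LEN(seq):
    # Mirrors the C macro: a NULL (None) sequence has length 0.
    return 0 if seq is None else seq.size

def asdl_seq_GET(seq, i):
    return seq.elements[i]

def asdl_seq_SET(seq, i, value):
    # The Py_DEBUG variant of the macro asserts the index is in range.
    assert seq is not None and i < seq.size
    seq.elements[i] = value
```

In the generated Python-ast.c code, sequences of statements, expressions, and the like are built exactly this way: allocated at their final size, then filled slot by slot with asdl_seq_SET.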


================================================
FILE: ast27/Include/ast.h
================================================
#ifndef Ta27_AST_H
#define Ta27_AST_H
#ifdef __cplusplus
extern "C" {
#endif

mod_ty Ta27AST_FromNode(const node *, PyCompilerFlags *flags,
				  const char *, PyArena *);

#ifdef __cplusplus
}
#endif
#endif /* !Ta27_AST_H */


================================================
FILE: ast27/Include/bitset.h
================================================

#ifndef Ta27_BITSET_H
#define Ta27_BITSET_H
#ifdef __cplusplus
extern "C" {
#endif

/* Bitset interface */

#define BYTE		char

typedef BYTE *bitset;

bitset newbitset(int nbits);
void delbitset(bitset bs);
#define testbit(ss, ibit) (((ss)[BIT2BYTE(ibit)] & BIT2MASK(ibit)) != 0)
int addbit(bitset bs, int ibit); /* Returns 0 if already set */
int samebitset(bitset bs1, bitset bs2, int nbits);
void mergebitset(bitset bs1, bitset bs2, int nbits);

#define BITSPERBYTE	(8*sizeof(BYTE))
#define NBYTES(nbits)	(((nbits) + BITSPERBYTE - 1) / BITSPERBYTE)

#define BIT2BYTE(ibit)	((ibit) / BITSPERBYTE)
#define BIT2SHIFT(ibit)	((ibit) % BITSPERBYTE)
#define BIT2MASK(ibit)	(1 << BIT2SHIFT(ibit))
#define BYTE2BIT(ibyte)	((ibyte) * BITSPERBYTE)

#ifdef __cplusplus
}
#endif
#endif /* !Ta27_BITSET_H */


================================================
FILE: ast27/Include/compile.h
================================================

#ifndef Ta27_COMPILE_H
#define Ta27_COMPILE_H

#include "Python.h"

#ifdef __cplusplus
extern "C" {
#endif

/* Public interface */
PyAPI_FUNC(PyFutureFeatures *) PyFuture_FromAST(struct _mod *, const char *);


#ifdef __cplusplus
}
#endif
#endif /* !Ta27_COMPILE_H */


================================================
FILE: ast27/Include/errcode.h
================================================
#ifndef Ta27_ERRCODE_H
#define Ta27_ERRCODE_H
#ifdef __cplusplus
extern "C" {
#endif


/* Error codes passed around between file input, tokenizer, parser and
   interpreter.  This is necessary so we can turn them into Python
   exceptions at a higher level.  Note that some errors have a
   slightly different meaning when passed from the tokenizer to the
   parser than when passed from the parser to the interpreter; e.g.
   the parser only returns E_EOF when it hits EOF immediately, and it
   never returns E_OK. */

#define E_OK		10	/* No error */
#define E_EOF		11	/* End Of File */
#define E_INTR		12	/* Interrupted */
#define E_TOKEN		13	/* Bad token */
#define E_SYNTAX	14	/* Syntax error */
#define E_NOMEM		15	/* Ran out of memory */
#define E_DONE		16	/* Parsing complete */
#define E_ERROR		17	/* Execution error */
#define E_TABSPACE	18	/* Inconsistent mixing of tabs and spaces */
#define E_OVERFLOW      19	/* Node had too many children */
#define E_TOODEEP	20	/* Too many indentation levels */
#define E_DEDENT	21	/* No matching outer block for dedent */
#define E_DECODE	22	/* Error in decoding into Unicode */
#define E_EOFS		23	/* EOF in triple-quoted string */
#define E_EOLS		24	/* EOL in single-quoted string */
#define E_LINECONT	25	/* Unexpected characters after a line continuation */

#ifdef __cplusplus
}
#endif
#endif /* !Ta27_ERRCODE_H */


================================================
FILE: ast27/Include/graminit.h
================================================
/* Generated by Parser/pgen */

#define single_input 256
#define file_input 257
#define eval_input 258
#define decorator 259
#define decorators 260
#define decorated 261
#define funcdef 262
#define parameters 263
#define varargslist 264
#define fpdef 265
#define fplist 266
#define stmt 267
#define simple_stmt 268
#define small_stmt 269
#define expr_stmt 270
#define augassign 271
#define print_stmt 272
#define del_stmt 273
#define pass_stmt 274
#define flow_stmt 275
#define break_stmt 276
#define continue_stmt 277
#define return_stmt 278
#define yield_stmt 279
#define raise_stmt 280
#define import_stmt 281
#define import_name 282
#define import_from 283
#define import_as_name 284
#define dotted_as_name 285
#define import_as_names 286
#define dotted_as_names 287
#define dotted_name 288
#define global_stmt 289
#define exec_stmt 290
#define assert_stmt 291
#define compound_stmt 292
#define if_stmt 293
#define while_stmt 294
#define for_stmt 295
#define try_stmt 296
#define with_stmt 297
#define with_item 298
#define except_clause 299
#define suite 300
#define testlist_safe 301
#define old_test 302
#define old_lambdef 303
#define test 304
#define or_test 305
#define and_test 306
#define not_test 307
#define comparison 308
#define comp_op 309
#define expr 310
#define xor_expr 311
#define and_expr 312
#define shift_expr 313
#define arith_expr 314
#define term 315
#define factor 316
#define power 317
#define atom 318
#define listmaker 319
#define testlist_comp 320
#define lambdef 321
#define trailer 322
#define subscriptlist 323
#define subscript 324
#define sliceop 325
#define exprlist 326
#define testlist 327
#define dictorsetmaker 328
#define classdef 329
#define arglist 330
#define argument 331
#define list_iter 332
#define list_for 333
#define list_if 334
#define comp_iter 335
#define comp_for 336
#define comp_if 337
#define testlist1 338
#define encoding_decl 339
#define yield_expr 340
#define func_type_input 341
#define func_type 342
#define typelist 343


================================================
FILE: ast27/Include/grammar.h
================================================

/* Grammar interface */

#ifndef Ta27_GRAMMAR_H
#define Ta27_GRAMMAR_H
#ifdef __cplusplus
extern "C" {
#endif

#include "../Include/bitset.h"

/* A label of an arc */

typedef struct {
    int		 lb_type;
    char	*lb_str;
} label;

#define EMPTY 0		/* Label number 0 is by definition the empty label */

/* A list of labels */

typedef struct {
    int		 ll_nlabels;
    label	*ll_label;
} labellist;

/* An arc from one state to another */

typedef struct {
    short	a_lbl;		/* Label of this arc */
    short	a_arrow;	/* State where this arc goes to */
} arc;

/* A state in a DFA */

typedef struct {
    int		 s_narcs;
    arc		*s_arc;		/* Array of arcs */
	
    /* Optional accelerators */
    int		 s_lower;	/* Lowest label index */
    int		 s_upper;	/* Highest label index */
    int		*s_accel;	/* Accelerator */
    int		 s_accept;	/* Nonzero for accepting state */
} state;

/* A DFA */

typedef struct {
    int		 d_type;	/* Non-terminal this represents */
    char	*d_name;	/* For printing */
    int		 d_initial;	/* Initial state */
    int		 d_nstates;
    state	*d_state;	/* Array of states */
    bitset	 d_first;
} dfa;

/* A grammar */

typedef struct {
    int		 g_ndfas;
    dfa		*g_dfa;		/* Array of DFAs */
    labellist	 g_ll;
    int		 g_start;	/* Start symbol of the grammar */
    int		 g_accel;	/* Set if accelerators present */
} grammar;

/* FUNCTIONS */

grammar *newgrammar(int start);
dfa *adddfa(grammar *g, int type, char *name);
int addstate(dfa *d);
void addarc(dfa *d, int from, int to, int lbl);
dfa *Ta27Grammar_FindDFA(grammar *g, int type);

int addlabel(labellist *ll, int type, char *str);
int findlabel(labellist *ll, int type, char *str);
char *Ta27Grammar_LabelRepr(label *lb);
void translatelabels(grammar *g);

void addfirstsets(grammar *g);

void Ta27Grammar_AddAccelerators(grammar *g);
void Ta27Grammar_RemoveAccelerators(grammar *);

void printgrammar(grammar *g, FILE *fp);
void printnonterminals(grammar *g, FILE *fp);

#ifdef __cplusplus
}
#endif
#endif /* !Ta27_GRAMMAR_H */


================================================
FILE: ast27/Include/node.h
================================================

/* Parse tree node interface */

#ifndef Ta27_NODE_H
#define Ta27_NODE_H
#ifdef __cplusplus
extern "C" {
#endif

typedef struct _node {
    short		n_type;
    char		*n_str;
    int			n_lineno;
    int			n_col_offset;
    int			n_nchildren;
    struct _node	*n_child;
} node;

node *Ta27Node_New(int type);
int Ta27Node_AddChild(node *n, int type,
                      char *str, int lineno, int col_offset);
void Ta27Node_Free(node *n);
Py_ssize_t _Ta27Node_SizeOf(node *n);

/* Node access functions */
#define NCH(n)		((n)->n_nchildren)
	
#define CHILD(n, i)	(&(n)->n_child[i])
#define RCHILD(n, i)	(CHILD(n, NCH(n) + i))
#define TYPE(n)		((n)->n_type)
#define STR(n)		((n)->n_str)

/* Assert that the type of a node is what we expect */
#define REQ(n, type) assert(TYPE(n) == (type))

PyAPI_FUNC(void) PyNode_ListTree(node *);

#ifdef __cplusplus
}
#endif
#endif /* !Ta27_NODE_H */
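The node accessor macros translate directly. A Python model (illustrative only, not the real structures) makes the negative-index behavior of RCHILD explicit:

```python
# Hypothetical model of the parse-tree node accessors (NCH, CHILD, RCHILD,
# TYPE, STR) from node.h; illustration only.

class Node:
    def __init__(self, type_, children=None, str_=None):
        self.n_type = type_
        self.n_str = str_
        self.n_child = children or []

def NCH(n):
    return len(n.n_child)

def CHILD(n, i):
    return n.n_child[i]

def RCHILD(n, i):
    # Called with a negative i: counts back from the last child.
    return CHILD(n, NCH(n) + i)

def TYPE(n):
    return n.n_type

def STR(n):
    return n.n_str
```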


================================================
FILE: ast27/Include/parsetok.h
================================================

/* Parser-tokenizer link interface */

#ifndef Ta27_PARSETOK_H
#define Ta27_PARSETOK_H
#ifdef __cplusplus
extern "C" {
#endif

typedef struct {
    int error;
    PyObject *filename;
    int lineno;
    int offset;
    char *text;
    int token;
    int expected;
} perrdetail;

#if 0
#define PyPARSE_YIELD_IS_KEYWORD	0x0001
#endif

#define PyPARSE_DONT_IMPLY_DEDENT	0x0002

#if 0
#define PyPARSE_WITH_IS_KEYWORD		0x0003
#endif

#define PyPARSE_PRINT_IS_FUNCTION       0x0004
#define PyPARSE_UNICODE_LITERALS        0x0008

#define PyPARSE_IGNORE_COOKIE 0x0010


node *Ta27Parser_ParseString(const char *, grammar *, int,
                             perrdetail *);
node *Ta27Parser_ParseFile (FILE *, const char *, grammar *, int,
                            char *, char *, perrdetail *);

node *Ta27Parser_ParseStringFlags(const char *, grammar *, int,
                                  perrdetail *, int);
node *Ta27Parser_ParseFileFlags(FILE *, const char *, grammar *,
       			 int, char *, char *,
       			 perrdetail *, int);
node *Ta27Parser_ParseFileFlagsEx(FILE *, const char *, grammar *,
       			 int, char *, char *,
       			 perrdetail *, int *);

node *Ta27Parser_ParseStringFlagsFilename(const char *,
       		      const char *,
       		      grammar *, int,
                perrdetail *, int);
node *Ta27Parser_ParseStringFlagsFilenameEx(const char *,
					      const char *,
					      grammar *, int,
                                              perrdetail *, int *);

node *Ta27Parser_ParseStringObject(
    const char *s,
    PyObject *filename,
    grammar *g,
    int start,
    perrdetail *err_ret,
    int *flags);

/* Note that the following function is defined in pythonrun.c not parsetok.c. */
PyAPI_FUNC(void) PyParser_SetError(perrdetail *);

#ifdef __cplusplus
}
#endif
#endif /* !Ta27_PARSETOK_H */


================================================
FILE: ast27/Include/pgenheaders.h
================================================
#ifndef DUMMY_Py_PGENHEADERS_H
#define DUMMY_Py_PGENHEADERS_H

/* pgenheaders.h is included by a bunch of files but nothing in it is
 * used except for the Python.h import, and it was removed in Python
 * 3.8. Since some of those files are generated we provide a dummy
 * pgenheaders.h. */
#include "Python.h"

#endif /* !DUMMY_Py_PGENHEADERS_H */


================================================
FILE: ast27/Include/pyarena.h
================================================
/* An arena-like memory interface for the compiler.
 */

#ifndef Ta27_PYARENA_H
#define Ta27_PYARENA_H

#if PY_MINOR_VERSION >= 10
#include "../Include/pycore_pyarena.h"

#define PyArena_New _PyArena_New
#define PyArena_Free _PyArena_Free
#define PyArena_Malloc _PyArena_Malloc
#define PyArena_AddPyObject _PyArena_AddPyObject
#endif

#endif /* !Ta27_PYARENA_H */


================================================
FILE: ast27/Include/pycore_pyarena.h
================================================
/* An arena-like memory interface for the compiler.
 */

#ifndef Ta27_INTERNAL_PYARENA_H
#define Ta27_INTERNAL_PYARENA_H
#ifdef __cplusplus
extern "C" {
#endif

typedef struct _arena PyArena;

/* _PyArena_New() and _PyArena_Free() create a new arena and free it,
   respectively.  Once an arena has been created, it can be used
   to allocate memory via _PyArena_Malloc().  Pointers to PyObject can
   also be registered with the arena via _PyArena_AddPyObject(), and the
   arena will ensure that the PyObjects stay alive at least until
   _PyArena_Free() is called.  When an arena is freed, all the memory it
   allocated is freed, the arena releases internal references to registered
   PyObject*, and none of its pointers are valid.
   XXX (tim) What does "none of its pointers are valid" mean?  Does it
   XXX mean that pointers previously obtained via _PyArena_Malloc() are
   XXX no longer valid?  (That's clearly true, but not sure that's what
   XXX the text is trying to say.)

   _PyArena_New() returns an arena pointer.  On error, it
   returns a negative number and sets an exception.
   XXX (tim):  Not true.  On error, _PyArena_New() actually returns NULL,
   XXX and looks like it may or may not set an exception (e.g., if the
   XXX internal PyList_New(0) returns NULL, _PyArena_New() passes that on
   XXX and an exception is set; OTOH, if the internal
   XXX block_new(DEFAULT_BLOCK_SIZE) returns NULL, that's passed on but
   XXX an exception is not set in that case).
*/
PyAPI_FUNC(PyArena*) _PyArena_New(void);
PyAPI_FUNC(void) _PyArena_Free(PyArena *);

/* Mostly like malloc(), return the address of a block of memory spanning
 * `size` bytes, or return NULL (without setting an exception) if enough
 * new memory can't be obtained.  Unlike malloc(0), _PyArena_Malloc() with
 * size=0 does not guarantee to return a unique pointer (the pointer
 * returned may equal one or more other pointers obtained from
 * _PyArena_Malloc()).
 * Note that pointers obtained via _PyArena_Malloc() must never be passed to
 * the system free() or realloc(), or to any of Python's similar memory-
 * management functions.  _PyArena_Malloc()-obtained pointers remain valid
 * until _PyArena_Free(ar) is called, at which point all pointers obtained
 * from the arena `ar` become invalid simultaneously.
 */
PyAPI_FUNC(void*) _PyArena_Malloc(PyArena *, size_t size);

/* This routine isn't a proper arena allocation routine.  It takes
 * a PyObject* and records it so that it can be DECREFed when the
 * arena is freed.
 */
PyAPI_FUNC(int) _PyArena_AddPyObject(PyArena *, PyObject *);

#ifdef __cplusplus
}
#endif
#endif /* !Ta27_INTERNAL_PYARENA_H */
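The contract described in the comments above — every allocation and every registered object stays alive until the whole arena is freed in one shot — can be sketched in Python. The names here are hypothetical; the real implementation is a C block allocator plus a PyList of registered objects:

```python
# Illustrative Python model of the arena contract; not the real C API.

class Arena:
    def __init__(self):
        self._blocks = []    # memory handed out by malloc()
        self._objects = []   # objects registered via add_object()
        self.freed = False

    def malloc(self, size):
        # Like _PyArena_Malloc: the block lives until free() is called.
        assert not self.freed
        block = bytearray(size)
        self._blocks.append(block)
        return block

    def add_object(self, obj):
        # Like _PyArena_AddPyObject: the arena keeps a reference alive.
        assert not self.freed
        self._objects.append(obj)
        return 0

    def free(self):
        # Like _PyArena_Free: all blocks and references go away at once.
        self._blocks.clear()
        self._objects.clear()
        self.freed = True
```

This all-at-once lifetime is why AST construction never frees individual nodes: every node and identifier is either arena-allocated or arena-registered, and one _PyArena_Free reclaims the whole tree.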


================================================
FILE: ast27/Include/token.h
================================================

/* Token types */

#ifndef Ta27_TOKEN_H
#define Ta27_TOKEN_H
#ifdef __cplusplus
extern "C" {
#endif

#undef TILDE   /* Prevent clash of our definition with system macro. Ex AIX, ioctl.h */

#define ENDMARKER	0
#define NAME		1
#define NUMBER		2
#define STRING		3
#define NEWLINE		4
#define INDENT		5
#define DEDENT		6
#define LPAR		7
#define RPAR		8
#define LSQB		9
#define RSQB		10
#define COLON		11
#define COMMA		12
#define SEMI		13
#define PLUS		14
#define MINUS		15
#define STAR		16
#define SLASH		17
#define VBAR		18
#define AMPER		19
#define LESS		20
#define GREATER		21
#define EQUAL		22
#define DOT		23
#define PERCENT		24
#define BACKQUOTE	25
#define LBRACE		26
#define RBRACE		27
#define EQEQUAL		28
#define NOTEQUAL	29
#define LESSEQUAL	30
#define GREATEREQUAL	31
#define TILDE		32
#define CIRCUMFLEX	33
#define LEFTSHIFT	34
#define RIGHTSHIFT	35
#define DOUBLESTAR	36
#define PLUSEQUAL	37
#define MINEQUAL	38
#define STAREQUAL	39
#define SLASHEQUAL	40
#define PERCENTEQUAL	41
#define AMPEREQUAL	42
#define VBAREQUAL	43
#define CIRCUMFLEXEQUAL	44
#define LEFTSHIFTEQUAL	45
#define RIGHTSHIFTEQUAL	46
#define DOUBLESTAREQUAL	47
#define DOUBLESLASH	48
#define DOUBLESLASHEQUAL 49
#define AT              50	
/* Don't forget to update the table _Ta27Parser_TokenNames in tokenizer.c! */
#define OP		51
#define RARROW          52
#define TYPE_IGNORE	53
#define TYPE_COMMENT	54
#define ERRORTOKEN	55
#define N_TOKENS	56

/* Special definitions for cooperation with parser */

#define NT_OFFSET		256

#define ISTERMINAL(x)		((x) < NT_OFFSET)
#define ISNONTERMINAL(x)	((x) >= NT_OFFSET)
#define ISEOF(x)		((x) == ENDMARKER)


extern char *_Ta27Parser_TokenNames[]; /* Token names */
int Ta27Token_OneChar(int);
int Ta27Token_TwoChars(int, int);
int Ta27Token_ThreeChars(int, int, int);

#ifdef __cplusplus
}
#endif
#endif /* !Ta27_TOKEN_H */
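The NT_OFFSET split above means every symbol number below 256 is a terminal token, while every grammar-rule number (see graminit.h, which starts at single_input = 256) is a nonterminal. In Python terms (values copied from the headers; sketch only):

```python
# The terminal/nonterminal classification from token.h, restated in Python.

NT_OFFSET = 256

ENDMARKER = 0        # token numbers from token.h
NAME = 1
single_input = 256   # nonterminal numbers from graminit.h start here

def isterminal(x):
    return x < NT_OFFSET        # ISTERMINAL

def isnonterminal(x):
    return x >= NT_OFFSET       # ISNONTERMINAL

def iseof(x):
    return x == ENDMARKER       # ISEOF
```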


================================================
FILE: ast27/Parser/Python.asdl
================================================
-- ASDL's five builtin types are identifier, int, string, object, bool

module Python version "$Revision$"
{
	mod = Module(stmt* body, type_ignore *type_ignores)
	    | Interactive(stmt* body)
	    | Expression(expr body)
	    | FunctionType(expr* argtypes, expr returns)

	    -- not really an actual node but useful in Jython's typesystem.
	    | Suite(stmt* body)

	stmt = FunctionDef(identifier name, arguments args,
                            stmt* body, expr* decorator_list, string? type_comment)
	      | ClassDef(identifier name, expr* bases, stmt* body, expr* decorator_list)
	      | Return(expr? value)

	      | Delete(expr* targets)
	      | Assign(expr* targets, expr value, string? type_comment)
	      | AugAssign(expr target, operator op, expr value)

	      -- not sure if bool is allowed, can always use int
 	      | Print(expr? dest, expr* values, bool nl)

	      -- use 'orelse' because else is a keyword in target languages
	      | For(expr target, expr iter, stmt* body, stmt* orelse, string? type_comment)
	      | While(expr test, stmt* body, stmt* orelse)
	      | If(expr test, stmt* body, stmt* orelse)
	      | With(expr context_expr, expr? optional_vars, stmt* body, string? type_comment)

	      -- 'type' is a bad name
	      | Raise(expr? type, expr? inst, expr? tback)
	      | TryExcept(stmt* body, excepthandler* handlers, stmt* orelse)
	      | TryFinally(stmt* body, stmt* finalbody)
	      | Assert(expr test, expr? msg)

	      | Import(alias* names)
	      | ImportFrom(identifier? module, alias* names, int? level)

	      -- Doesn't capture requirement that locals must be
	      -- defined if globals is
	      -- still supports use as a function!
	      | Exec(expr body, expr? globals, expr? locals)

	      | Global(identifier* names)
	      | Expr(expr value)
	      | Pass | Break | Continue

	      -- XXX Jython will be different
	      -- col_offset is the byte offset in the utf8 string the parser uses
	      attributes (int lineno, int col_offset)

	      -- BoolOp() can use left & right?
	expr = BoolOp(boolop op, expr* values)
	     | BinOp(expr left, operator op, expr right)
	     | UnaryOp(unaryop op, expr operand)
	     | Lambda(arguments args, expr body)
	     | IfExp(expr test, expr body, expr orelse)
	     | Dict(expr* keys, expr* values)
	     | Set(expr* elts)
	     | ListComp(expr elt, comprehension* generators)
	     | SetComp(expr elt, comprehension* generators)
	     | DictComp(expr key, expr value, comprehension* generators)
	     | GeneratorExp(expr elt, comprehension* generators)
	     -- the grammar constrains where yield expressions can occur
	     | Yield(expr? value)
	     -- need sequences for compare to distinguish between
	     -- x < 4 < 3 and (x < 4) < 3
	     | Compare(expr left, cmpop* ops, expr* comparators)
	     | Call(expr func, expr* args, keyword* keywords,
			 expr? starargs, expr? kwargs)
	     | Repr(expr value)
	     | Num(object n) -- a number as a PyObject.
	     | Str(string s, string kind)
	     -- other literals? bools?

	     -- the following expression can appear in assignment context
	     | Attribute(expr value, identifier attr, expr_context ctx)
	     | Subscript(expr value, slice slice, expr_context ctx)
	     | Name(identifier id, expr_context ctx)
	     | List(expr* elts, expr_context ctx)
	     | Tuple(expr* elts, expr_context ctx)

	      -- col_offset is the byte offset in the utf8 string the parser uses
	      attributes (int lineno, int col_offset)

	expr_context = Load | Store | Del | AugLoad | AugStore | Param

	slice = Ellipsis | Slice(expr? lower, expr? upper, expr? step)
	      | ExtSlice(slice* dims)
	      | Index(expr value)

	boolop = And | Or

	operator = Add | Sub | Mult | Div | Mod | Pow | LShift
                 | RShift | BitOr | BitXor | BitAnd | FloorDiv

	unaryop = Invert | Not | UAdd | USub

	cmpop = Eq | NotEq | Lt | LtE | Gt | GtE | Is | IsNot | In | NotIn

	comprehension = (expr target, expr iter, expr* ifs)

	-- not sure what to call the first argument for raise and except
	excepthandler = ExceptHandler(expr? type, expr? name, stmt* body)
	                attributes (int lineno, int col_offset)

	-- type_comments is used to support the per-argument type comment syntax.
	-- It is either an empty list or a list with length equal to the number of
	-- args (including varargs and kwargs, if present) and with members set to the
	-- string of each arg's type comment, if present, or None otherwise.
	arguments = (expr* args, identifier? vararg,
		     identifier? kwarg, expr* defaults, string* type_comments)

        -- keyword arguments supplied to call
        keyword = (identifier arg, expr value)

        -- import name with optional 'as' alias.
        alias = (identifier name, identifier? asname)

	type_ignore = TypeIgnore(int lineno, string tag)
}
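The type_comment, type_ignore, and FunctionType additions in this ASDL were later upstreamed into CPython's own ast module, so their shape can be observed without typed_ast itself. A small demonstration (assumes Python 3.8 or newer; typed_ast exposes the same fields on its own node classes):

```python
# Observing the fields this ASDL defines, via the stdlib ast module that
# inherited them (Python 3.8+ required for type_comments=True).
import ast

src = (
    "def f(a, b):\n"
    "    # type: (int, str) -> bool\n"
    "    return True\n"
    "x = 1  # type: ignore\n"
)
tree = ast.parse(src, type_comments=True)

fn = tree.body[0]             # FunctionDef(..., string? type_comment)
ignores = tree.type_ignores   # Module(stmt* body, type_ignore* type_ignores)
```

The comment string itself can then be parsed with mode="func_type", which produces the FunctionType(expr* argtypes, expr returns) node declared at the top of this module.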


================================================
FILE: ast27/Parser/acceler.c
================================================

/* Parser accelerator module */

/* The parser as originally conceived had disappointing performance.
   This module does some precomputation that speeds up the selection
   of a DFA based upon a token, turning a search through an array
   into a simple indexing operation.  The parser now cannot work
   without the accelerators installed.  Note that the accelerators
   are installed dynamically when the parser is initialized, they
   are not part of the static data structure written to graminit.[ch]
   by the parser generator. */

#include "../Include/pgenheaders.h"
#include "../Include/grammar.h"
#include "../Include/node.h"
#include "../Include/token.h"
#include "parser.h"

/* Forward references */
static void fixdfa(grammar *, dfa *);
static void fixstate(grammar *, state *);

void
Ta27Grammar_AddAccelerators(grammar *g)
{
    dfa *d;
    int i;
    d = g->g_dfa;
    for (i = g->g_ndfas; --i >= 0; d++)
        fixdfa(g, d);
    g->g_accel = 1;
}

void
Ta27Grammar_RemoveAccelerators(grammar *g)
{
    dfa *d;
    int i;
    g->g_accel = 0;
    d = g->g_dfa;
    for (i = g->g_ndfas; --i >= 0; d++) {
        state *s;
        int j;
        s = d->d_state;
        for (j = 0; j < d->d_nstates; j++, s++) {
            if (s->s_accel)
                PyObject_FREE(s->s_accel);
            s->s_accel = NULL;
        }
    }
}

static void
fixdfa(grammar *g, dfa *d)
{
    state *s;
    int j;
    s = d->d_state;
    for (j = 0; j < d->d_nstates; j++, s++)
        fixstate(g, s);
}

static void
fixstate(grammar *g, state *s)
{
    arc *a;
    int k;
    int *accel;
    int nl = g->g_ll.ll_nlabels;
    s->s_accept = 0;
    accel = (int *) PyObject_MALLOC(nl * sizeof(int));
    if (accel == NULL) {
        fprintf(stderr, "no mem to build parser accelerators\n");
        exit(1);
    }
    for (k = 0; k < nl; k++)
        accel[k] = -1;
    a = s->s_arc;
    for (k = s->s_narcs; --k >= 0; a++) {
        int lbl = a->a_lbl;
        label *l = &g->g_ll.ll_label[lbl];
        int type = l->lb_type;
        if (a->a_arrow >= (1 << 7)) {
            printf("XXX too many states!\n");
            continue;
        }
        if (ISNONTERMINAL(type)) {
            dfa *d1 = Ta27Grammar_FindDFA(g, type);
            int ibit;
            if (type - NT_OFFSET >= (1 << 7)) {
                printf("XXX too high nonterminal number!\n");
                continue;
            }
            for (ibit = 0; ibit < g->g_ll.ll_nlabels; ibit++) {
                if (testbit(d1->d_first, ibit)) {
                    if (accel[ibit] != -1)
                        printf("XXX ambiguity!\n");
                    accel[ibit] = a->a_arrow | (1 << 7) |
                        ((type - NT_OFFSET) << 8);
                }
            }
        }
        else if (lbl == EMPTY)
            s->s_accept = 1;
        else if (lbl >= 0 && lbl < nl)
            accel[lbl] = a->a_arrow;
    }
    while (nl > 0 && accel[nl-1] == -1)
        nl--;
    for (k = 0; k < nl && accel[k] == -1;)
        k++;
    if (k < nl) {
        int i;
        s->s_accel = (int *) PyObject_MALLOC((nl-k) * sizeof(int));
        if (s->s_accel == NULL) {
            fprintf(stderr, "no mem to add parser accelerators\n");
            exit(1);
        }
        s->s_lower = k;
        s->s_upper = nl;
        for (i = 0; k < nl; i++, k++)
            s->s_accel[i] = accel[k];
    }
    PyObject_FREE(accel);
}
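fixstate() above packs each accelerator entry into a single int: the target state in the low 7 bits, a marker at bit 7, and the nonterminal number (minus NT_OFFSET) in the bits above that. A Python sketch of just that encoding (illustrative, not the real parser):

```python
# The accelerator packing used in fixstate(), restated in Python.

NT_OFFSET = 256

def pack_nonterminal(arrow, nt_type):
    """Encode a nonterminal arc: arrow | (1 << 7) | ((type - NT_OFFSET) << 8)."""
    assert arrow < (1 << 7) and nt_type - NT_OFFSET < (1 << 7)
    return arrow | (1 << 7) | ((nt_type - NT_OFFSET) << 8)

def unpack(accel):
    """Decode an entry; returns (arrow, nonterminal) or (arrow, None)."""
    if accel & (1 << 7):                     # marker bit: nonterminal arc
        arrow = accel & ((1 << 7) - 1)
        nt_type = (accel >> 8) + NT_OFFSET
        return arrow, nt_type
    return accel, None                       # plain terminal transition
```

This packing is what lets the parser replace the per-token scan over a state's arcs with one array index: accel[label] either is -1 (no transition), holds a terminal transition directly, or names the sub-DFA to push along with the state to return to.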


================================================
FILE: ast27/Parser/asdl.py
================================================
"""An implementation of the Zephyr Abstract Syntax Definition Language.

See http://asdl.sourceforge.net/ and
http://www.cs.princeton.edu/research/techreps/TR-554-97

Only supports top level module decl, not view.  I'm guessing that view
is intended to support the browser and I'm not interested in the
browser.

Changes for Python: Add support for module versions
"""

import os
import traceback

import spark

class Token(object):
    # spark seems to dispatch in the parser based on a token's
    # type attribute
    def __init__(self, type, lineno):
        self.type = type
        self.lineno = lineno

    def __str__(self):
        return self.type

    def __repr__(self):
        return str(self)

class Id(Token):
    def __init__(self, value, lineno):
        self.type = 'Id'
        self.value = value
        self.lineno = lineno

    def __str__(self):
        return self.value

class String(Token):
    def __init__(self, value, lineno):
        self.type = 'String'
        self.value = value
        self.lineno = lineno

class ASDLSyntaxError(Exception):

    def __init__(self, lineno, token=None, msg=None):
        self.lineno = lineno
        self.token = token
        self.msg = msg

    def __str__(self):
        if self.msg is None:
            return "Error at '%s', line %d" % (self.token, self.lineno)
        else:
            return "%s, line %d" % (self.msg, self.lineno)

class ASDLScanner(spark.GenericScanner, object):

    def tokenize(self, input):
        self.rv = []
        self.lineno = 1
        super(ASDLScanner, self).tokenize(input)
        return self.rv

    def t_id(self, s):
        r"[\w\.]+"
        # XXX doesn't distinguish upper vs. lower, which is
        # significant for ASDL.
        self.rv.append(Id(s, self.lineno))

    def t_string(self, s):
        r'"[^"]*"'
        self.rv.append(String(s, self.lineno))

    def t_xxx(self, s): # not sure what this production means
        r"<="
        self.rv.append(Token(s, self.lineno))

    def t_punctuation(self, s):
        r"[\{\}\*\=\|\(\)\,\?\:]"
        self.rv.append(Token(s, self.lineno))

    def t_comment(self, s):
        r"\-\-[^\n]*"
        pass

    def t_newline(self, s):
        r"\n"
        self.lineno += 1

    def t_whitespace(self, s):
        r"[ \t]+"
        pass

    def t_default(self, s):
        r" . +"
        raise ValueError, "unmatched input: %s" % `s`

class ASDLParser(spark.GenericParser, object):
    def __init__(self):
        super(ASDLParser, self).__init__("module")

    def typestring(self, tok):
        return tok.type

    def error(self, tok):
        raise ASDLSyntaxError(tok.lineno, tok)

    def p_module_0(self, (module, name, version, _0, _1)):
        " module ::= Id Id version { } "
        if module.value != "module":
            raise ASDLSyntaxError(module.lineno,
                                  msg="expected 'module', found %s" % module)
        return Module(name, None, version)

    def p_module(self, (module, name, version, _0, definitions, _1)):
        " module ::= Id Id version { definitions } "
        if module.value != "module":
            raise ASDLSyntaxError(module.lineno,
                                  msg="expected 'module', found %s" % module)
        return Module(name, definitions, version)

    def p_version(self, (version, V)):
        "version ::= Id String"
        if version.value != "version":
            raise ASDLSyntaxError(version.lineno,
                                msg="expected 'version', found %s" % version)
        return V

    def p_definition_0(self, (definition,)):
        " definitions ::= definition "
        return definition

    def p_definition_1(self, (definitions, definition)):
        " definitions ::= definition definitions "
        return definitions + definition

    def p_definition(self, (id, _, type)):
        " definition ::= Id = type "
        return [Type(id, type)]

    def p_type_0(self, (product,)):
        " type ::= product "
        return product

    def p_type_1(self, (sum,)):
        " type ::= sum "
        return Sum(sum)

    def p_type_2(self, (sum, id, _0, attributes, _1)):
        " type ::= sum Id ( fields ) "
        if id.value != "attributes":
            raise ASDLSyntaxError(id.lineno,
                                  msg="expected attributes, found %s" % id)
        if attributes:
            attributes.reverse()
        return Sum(sum, attributes)

    def p_product(self, (_0, fields, _1)):
        " product ::= ( fields ) "
        # XXX can't I just construct things in the right order?
        fields.reverse()
        return Product(fields)

    def p_sum_0(self, (constructor,)):
        " sum ::= constructor "
        return [constructor]

    def p_sum_1(self, (constructor, _, sum)):
        " sum ::= constructor | sum "
        return [constructor] + sum

    def p_sum_2(self, (constructor, _, sum)):
        " sum ::= constructor | sum "
        return [constructor] + sum

    def p_constructor_0(self, (id,)):
        " constructor ::= Id "
        return Constructor(id)

    def p_constructor_1(self, (id, _0, fields, _1)):
        " constructor ::= Id ( fields ) "
        # XXX can't I just construct things in the right order?
        fields.reverse()
        return Constructor(id, fields)

    def p_fields_0(self, (field,)):
        " fields ::= field "
        return [field]

    def p_fields_1(self, (field, _, fields)):
        " fields ::= field , fields "
        return fields + [field]

    def p_field_0(self, (type,)):
        " field ::= Id "
        return Field(type)

    def p_field_1(self, (type, name)):
        " field ::= Id Id "
        return Field(type, name)

    def p_field_2(self, (type, _, name)):
        " field ::= Id * Id "
        return Field(type, name, seq=True)

    def p_field_3(self, (type, _, name)):
        " field ::= Id ? Id "
        return Field(type, name, opt=True)

    def p_field_4(self, (type, _)):
        " field ::= Id * "
        return Field(type, seq=True)

    def p_field_5(self, (type, _)):
        " field ::= Id ? "
        return Field(type, opt=True)

builtin_types = ("identifier", "string", "int", "bool", "object")

# below is a collection of classes to capture the AST of an AST :-)
# not sure if any of the methods are useful yet, but I'm adding them
# piecemeal as they seem helpful

class AST(object):
    pass # a marker class

class Module(AST):
    def __init__(self, name, dfns, version):
        self.name = name
        self.dfns = dfns
        self.version = version
        self.types = {} # maps type name to value (from dfns)
        for type in dfns:
            self.types[type.name.value] = type.value

    def __repr__(self):
        return "Module(%s, %s)" % (self.name, self.dfns)

class Type(AST):
    def __init__(self, name, value):
        self.name = name
        self.value = value

    def __repr__(self):
        return "Type(%s, %s)" % (self.name, self.value)

class Constructor(AST):
    def __init__(self, name, fields=None):
        self.name = name
        self.fields = fields or []

    def __repr__(self):
        return "Constructor(%s, %s)" % (self.name, self.fields)

class Field(AST):
    def __init__(self, type, name=None, seq=False, opt=False):
        self.type = type
        self.name = name
        self.seq = seq
        self.opt = opt

    def __repr__(self):
        if self.seq:
            extra = ", seq=True"
        elif self.opt:
            extra = ", opt=True"
        else:
            extra = ""
        if self.name is None:
            return "Field(%s%s)" % (self.type, extra)
        else:
            return "Field(%s, %s%s)" % (self.type, self.name, extra)

class Sum(AST):
    def __init__(self, types, attributes=None):
        self.types = types
        self.attributes = attributes or []

    def __repr__(self):
        if self.attributes is None:
            return "Sum(%s)" % self.types
        else:
            return "Sum(%s, %s)" % (self.types, self.attributes)

class Product(AST):
    def __init__(self, fields):
        self.fields = fields

    def __repr__(self):
        return "Product(%s)" % self.fields

class VisitorBase(object):

    def __init__(self, skip=False):
        self.cache = {}
        self.skip = skip

    def visit(self, object, *args):
        meth = self._dispatch(object)
        if meth is None:
            return
        try:
            meth(object, *args)
        except Exception, err:
            print "Error visiting", repr(object)
            print err
            traceback.print_exc()
            # XXX hack
            if hasattr(self, 'file'):
                self.file.flush()
            os._exit(1)

    def _dispatch(self, object):
        assert isinstance(object, AST), repr(object)
        klass = object.__class__
        meth = self.cache.get(klass)
        if meth is None:
            methname = "visit" + klass.__name__
            if self.skip:
                meth = getattr(self, methname, None)
            else:
                meth = getattr(self, methname)
            self.cache[klass] = meth
        return meth

class Check(VisitorBase):

    def __init__(self):
        super(Check, self).__init__(skip=True)
        self.cons = {}
        self.errors = 0
        self.types = {}

    def visitModule(self, mod):
        for dfn in mod.dfns:
            self.visit(dfn)

    def visitType(self, type):
        self.visit(type.value, str(type.name))

    def visitSum(self, sum, name):
        for t in sum.types:
            self.visit(t, name)

    def visitConstructor(self, cons, name):
        key = str(cons.name)
        conflict = self.cons.get(key)
        if conflict is None:
            self.cons[key] = name
        else:
            print "Redefinition of constructor %s" % key
            print "Defined in %s and %s" % (conflict, name)
            self.errors += 1
        for f in cons.fields:
            self.visit(f, key)

    def visitField(self, field, name):
        key = str(field.type)
        l = self.types.setdefault(key, [])
        l.append(name)

    def visitProduct(self, prod, name):
        for f in prod.fields:
            self.visit(f, name)

def check(mod):
    v = Check()
    v.visit(mod)

    for t in v.types:
        if t not in mod.types and not t in builtin_types:
            v.errors += 1
            uses = ", ".join(v.types[t])
            print "Undefined type %s, used in %s" % (t, uses)

    return not v.errors

def parse(file):
    scanner = ASDLScanner()
    parser = ASDLParser()

    buf = open(file).read()
    tokens = scanner.tokenize(buf)
    try:
        return parser.parse(tokens)
    except ASDLSyntaxError, err:
        print err
        lines = buf.split("\n")
        print lines[err.lineno - 1] # lines starts at 0, files at 1

if __name__ == "__main__":
    import glob
    import sys

    if len(sys.argv) > 1:
        files = sys.argv[1:]
    else:
        testdir = "tests"
        files = glob.glob(testdir + "/*.asdl")

    for file in files:
        print file
        mod = parse(file)
        print "module", mod.name
        print len(mod.dfns), "definitions"
        if not check(mod):
            print "Check failed"
        else:
            for dfn in mod.dfns:
                print dfn.type


================================================
FILE: ast27/Parser/asdl_c.py
================================================
#! /usr/bin/env python
"""Generate C code from an ASDL description."""

# TO DO
# handle fields that have a type but no name

import os, sys

import asdl

TABSIZE = 8
MAX_COL = 100

def get_c_type(name):
    """Return a string for the C name of the type.

    This function special cases the default types provided by asdl:
    identifier, string, int, bool, object.
    """
    # XXX ack!  need to figure out where Id is useful and where string
    if isinstance(name, asdl.Id):
        name = name.value
    if name in asdl.builtin_types:
        return name
    else:
        return "%s_ty" % name
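A standalone sketch of the mapping above, runnable on its own (the real function additionally unwraps `asdl.Id` tokens to their `.value` first):

```python
# Builtins map to themselves; every other ASDL type name gets a _ty suffix.
builtin_types = ("identifier", "string", "int", "bool", "object")

def get_c_type(name):
    if name in builtin_types:
        return name
    return "%s_ty" % name

print(get_c_type("int"))   # builtin: passes through unchanged
print(get_c_type("stmt"))  # non-builtin: suffixed for the C typedef
```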

def reflow_lines(s, depth):
    """Reflow the line s indented depth tabs.

    Return a sequence of lines where no line extends beyond MAX_COL
    when properly indented.  The first line is properly indented based
    exclusively on depth * TABSIZE.  All following lines -- these are
    the reflowed lines generated by this function -- start at the same
    column as the first character beyond the opening { in the first
    line.
    """
    size = MAX_COL - depth * TABSIZE
    if len(s) < size:
        return [s]

    lines = []
    cur = s
    padding = ""
    while len(cur) > size:
        i = cur.rfind(' ', 0, size)
        # XXX this should be fixed for real
        if i == -1 and 'GeneratorExp' in cur:
            i = size + 3
        assert i != -1, "Impossible line %d to reflow: %r" % (size, s)
        lines.append(padding + cur[:i])
        if len(lines) == 1:
            # find new size based on brace
            j = cur.find('{', 0, i)
            if j >= 0:
                j += 2 # account for the brace and the space after it
                size -= j
                padding = " " * j
            else:
                j = cur.find('(', 0, i)
                if j >= 0:
                    j += 1 # account for the paren (no space after it)
                    size -= j
                    padding = " " * j
        cur = cur[i+1:]
    else:
        lines.append(padding + cur)
    return lines
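The wrapping loop above is easiest to see on a concrete call. Below is a minimal standalone sketch of the same algorithm, keeping only the paren-based continuation indent (the `{` case and the GeneratorExp workaround are omitted); `demo` is a made-up input, not from this repository:

```python
TABSIZE = 8
MAX_COL = 100

def reflow_lines(s, depth):
    # Wrap s at spaces so no line exceeds MAX_COL - depth * TABSIZE.
    # Continuation lines are indented to the column after the first '('.
    size = MAX_COL - depth * TABSIZE
    if len(s) < size:
        return [s]
    lines = []
    cur = s
    padding = ""
    while len(cur) > size:
        i = cur.rfind(' ', 0, size)
        assert i != -1, "no space to break at"
        lines.append(padding + cur[:i])
        if len(lines) == 1:
            j = cur.find('(', 0, i)
            if j >= 0:
                j += 1          # indent one past the paren itself
                size -= j
                padding = " " * j
        cur = cur[i + 1:]
    lines.append(padding + cur)
    return lines

demo = "f(" + "arg, " * 30 + "end)"
wrapped = reflow_lines(demo, 2)   # every line fits in 100 - 2*8 columns
```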

def is_simple(sum):
    """Return True if a sum is simple.

    A sum is simple if its types have no fields, e.g.
    unaryop = Invert | Not | UAdd | USub
    """
    for t in sum.types:
        if t.fields:
            return False
    return True
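For reference, the `is_simple` predicate can be exercised with stand-in nodes; the `Ty` and `SumStub` names here are hypothetical placeholders for the `asdl.Constructor`/`asdl.Sum` objects the real function receives:

```python
class Ty:
    # Stand-in for a constructor: simple iff it carries no fields.
    def __init__(self, fields=None):
        self.fields = fields or []

class SumStub:
    def __init__(self, types):
        self.types = types

def is_simple(sum):
    # A sum is simple if none of its constructors has fields,
    # e.g. unaryop = Invert | Not | UAdd | USub
    return all(not t.fields for t in sum.types)

print(is_simple(SumStub([Ty(), Ty()])))      # fieldless: simple
print(is_simple(SumStub([Ty(["target"])])))  # has a field: not simple
```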


class EmitVisitor(asdl.VisitorBase):
    """Visitor that emits lines."""

    def __init__(self, file):
        self.file = file
        super(EmitVisitor, self).__init__()

    def emit(self, s, depth, reflow=True):
        # XXX reflow long lines?
        if reflow:
            lines = reflow_lines(s, depth)
        else:
            lines = [s]
        for line in lines:
            line = (" " * TABSIZE * depth) + line + "\n"
            self.file.write(line)


class TypeDefVisitor(EmitVisitor):
    def visitModule(self, mod):
        for dfn in mod.dfns:
            self.visit(dfn)

    def visitType(self, type, depth=0):
        self.visit(type.value, type.name, depth)

    def visitSum(self, sum, name, depth):
        if is_simple(sum):
            self.simple_sum(sum, name, depth)
        else:
            self.sum_with_constructors(sum, name, depth)

    def simple_sum(self, sum, name, depth):
        enum = []
        for i in range(len(sum.types)):
            type = sum.types[i]
            enum.append("%s=%d" % (type.name, i + 1))
        enums = ", ".join(enum)
        ctype = get_c_type(name)
        s = "typedef enum _%s { %s } %s;" % (name, enums, ctype)
        self.emit(s, depth)
        self.emit("", depth)
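What `simple_sum` emits for one concrete input can be sketched standalone; the `unaryop` constructors below come from the `is_simple` docstring above, and the emit/file plumbing is omitted:

```python
# Recreate the enum line simple_sum writes for
#   unaryop = Invert | Not | UAdd | USub
names = ["Invert", "Not", "UAdd", "USub"]
enums = ", ".join("%s=%d" % (n, i + 1) for i, n in enumerate(names))
line = "typedef enum _unaryop { %s } unaryop_ty;" % enums
print(line)
```

Note that enum members are numbered from 1, not 0, so a zeroed struct never aliases a valid kind.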

    def sum_with_constructors(self, sum, name, depth):
        ctype = get_c_type(name)
        s = "typedef struct _%(name)s *%(ctype)s;" % locals()
        self.emit(s, depth)
        self.emit("", depth)

    def visitProduct(self, product, name, depth):
        ctype = get_c_type(name)
        s = "typedef struct _%(name)s *%(ctype)s;" % locals()
        self.emit(s, depth)
        self.emit("", depth)


class StructVisitor(EmitVisitor):
    """Visitor to generate typedefs for AST."""

    def visitModule(self, mod):
        for dfn in mod.dfns:
            self.visit(dfn)

    def visitType(self, type, depth=0):
        self.visit(type.value, type.name, depth)

    def visitSum(self, sum, name, depth):
        if not is_simple(sum):
            self.sum_with_constructors(sum, name, depth)

    def sum_with_constructors(self, sum, name, depth):
        def emit(s, depth=depth):
            self.emit(s % sys._getframe(1).f_locals, depth)
        enum = []
        for i in range(len(sum.types)):
            type = sum.types[i]
            enum.append("%s_kind=%d" % (type.name, i + 1))

        emit("enum _%(name)s_kind {" + ", ".join(enum) + "};")

        emit("struct _%(name)s {")
        emit("enum _%(name)s_kind kind;", depth + 1)
        emit("union {", depth + 1)
        for t in sum.types:
            self.visit(t, depth + 2)
        emit("} v;", depth + 1)
        for field in sum.attributes:
            # rudimentary attribute handling
            type = str(field.type)
            assert type in asdl.builtin_types, type
            emit("%s %s;" % (type, field.name), depth + 1);
        emit("};")
        emit("")

    def visitConstructor(self, cons, depth):
        if cons.fields:
            self.emit("struct {", depth)
            for f in cons.fields:
                self.visit(f, depth + 1)
            self.emit("} %s;" % cons.name, depth)
            self.emit("", depth)
        else:
            # XXX not sure what I want here, nothing is probably fine
            pass

    def visitField(self, field, depth):
        # XXX need to lookup field.type, because it might be something
        # like a builtin...
        ctype = get_c_type(field.type)
        name = field.name
        if field.seq:
            if field.type.value in ('cmpop',):
                self.emit("asdl_int_seq *%(name)s;" % locals(), depth)
            else:
                self.emit("asdl_seq *%(name)s;" % locals(), depth)
        else:
            self.emit("%(ctype)s %(name)s;" % locals(), depth)

    def visitProduct(self, product, name, depth):
        self.emit("struct _%(name)s {" % locals(), depth)
        for f in product.fields:
            self.visit(f, depth + 1)
        self.emit("};", depth)
        self.emit("", depth)


class PrototypeVisitor(EmitVisitor):
    """Generate function prototypes for the .h file"""

    def visitModule(self, mod):
        for dfn in mod.dfns:
            self.visit(dfn)

    def visitType(self, type):
        self.visit(type.value, type.name)

    def visitSum(self, sum, name):
        if is_simple(sum):
            pass # XXX
        else:
            for t in sum.types:
                self.visit(t, name, sum.attributes)

    def get_args(self, fields):
        """Return a list of C argument info, one entry per field.

        Argument info is a 3-tuple of C type, variable name, and a flag
        that is true if the type can be NULL.
        """
        args = []
        unnamed = {}
        for f in fields:
            if f.name is None:
                name = f.type
                c = unnamed[name] = unnamed.get(name, 0) + 1
                if c > 1:
                    name = "name%d" % (c - 1)
            else:
                name = f.name
            # XXX should extend get_c_type() to handle this
            if f.seq:
                if f.type.value in ('cmpop',):
                    ctype = "asdl_int_seq *"
                else:
                    ctype = "asdl_seq *"
            else:
                ctype = get_c_type(f.type)
            args.append((ctype, name, f.opt or f.seq))
        return args

    def visitConstructor(self, cons, type, attrs):
        args = self.get_args(cons.fields)
        attrs = self.get_args(attrs)
        ctype = get_c_type(type)
        self.emit_function(cons.name, ctype, args, attrs)

    def emit_function(self, name, ctype, args, attrs, union=True):
        args = args + attrs
        if args:
            argstr = ", ".join(["%s %s" % (atype, aname)
                                for atype, aname, opt in args])
            argstr += ", PyArena *arena"
        else:
            argstr = "PyArena *arena"
        margs = "a0"
        for i in range(1, len(args)+1):
            margs += ", a%d" % i
        self.emit("#define %s(%s) _Ta27_%s(%s)" % (name, margs, name, margs), 0,
                reflow=False)
        self.emit("%s _Ta27_%s(%s);" % (ctype, name, argstr), False)

    def visitProduct(self, prod, name):
        self.emit_function(name, get_c_type(name),
                           self.get_args(prod.fields), [], union=False)


class FunctionVisitor(PrototypeVisitor):
    """Visitor to generate constructor functions for AST."""

    def emit_function(self, name, ctype, args, attrs, union=True):
        def emit(s, depth=0, reflow=True):
            self.emit(s, depth, reflow)
        argstr = ", ".join(["%s %s" % (atype, aname)
                            for atype, aname, opt in args + attrs])
        if argstr:
            argstr += ", PyArena *arena"
        else:
            argstr = "PyArena *arena"
        self.emit("%s" % ctype, 0)
        emit("%s(%s)" % (name, argstr))
        emit("{")
        emit("%s p;" % ctype, 1)
        for argtype, argname, opt in args:
            # XXX hack alert: false is allowed for a bool
            if not opt and not (argtype == "bool" or argtype == "int"):
                emit("if (!%s) {" % argname, 1)
                emit("PyErr_SetString(PyExc_ValueError,", 2)
                msg = "field %s is required for %s" % (argname, name)
                emit('                "%s");' % msg,
                     2, reflow=False)
                emit('return NULL;', 2)
                emit('}', 1)

        emit("p = (%s)PyArena_Malloc(arena, sizeof(*p));" % ctype, 1);
        emit("if (!p)", 1)
        emit("return NULL;", 2)
        if union:
            self.emit_body_union(name, args, attrs)
        else:
            self.emit_body_struct(name, args, attrs)
        emit("return p;", 1)
        emit("}")
        emit("")

    def emit_body_union(self, name, args, attrs):
        def emit(s, depth=0, reflow=True):
            self.emit(s, depth, reflow)
        emit("p->kind = %s_kind;" % name, 1)
        for argtype, argname, opt in args:
            emit("p->v.%s.%s = %s;" % (name, argname, argname), 1)
        for argtype, argname, opt in attrs:
            emit("p->%s = %s;" % (argname, argname), 1)

    def emit_body_struct(self, name, args, attrs):
        def emit(s, depth=0, reflow=True):
            self.emit(s, depth, reflow)
        for argtype, argname, opt in args:
            emit("p->%s = %s;" % (argname, argname), 1)
        assert not attrs


class PickleVisitor(EmitVisitor):

    def visitModule(self, mod):
        for dfn in mod.dfns:
            self.visit(dfn)

    def visitType(self, type):
        self.visit(type.value, type.name)

    def visitSum(self, sum, name):
        pass

    def visitProduct(self, sum, name):
        pass

    def visitConstructor(self, cons, name):
        pass

    def visitField(self, sum):
        pass


class Obj2ModPrototypeVisitor(PickleVisitor):
    def visitProduct(self, prod, name):
        code = "static int obj2ast_%s(PyObject* obj, %s* out, PyArena* arena);"
        self.emit(code % (name, get_c_type(name)), 0)

    visitSum = visitProduct


class Obj2ModVisitor(PickleVisitor):
    def funcHeader(self, name):
        ctype = get_c_type(name)
        self.emit("int", 0)
        self.emit("obj2ast_%s(PyObject* obj, %s* out, PyArena* arena)" % (name, ctype), 0)
        self.emit("{", 0)
        self.emit("PyObject* tmp = NULL;", 1)
        self.emit("int isinstance;", 1)
        self.emit("", 0)

    def sumTrailer(self, name):
        self.emit("", 0)
        self.emit("tmp = PyObject_Repr(obj);", 1)
        # there's really nothing more we can do if this fails ...
        self.emit("if (tmp == NULL) goto failed;", 1)
        error = "expected some sort of %s, but got %%.400s" % name
        format = "PyErr_Format(PyExc_TypeError, \"%s\", _PyUnicode_AsString(tmp));"
        self.emit(format % error, 1, reflow=False)
        self.emit("failed:", 0)
        self.emit("Py_XDECREF(tmp);", 1)
        self.emit("return 1;", 1)
        self.emit("}", 0)
        self.emit("", 0)

    def simpleSum(self, sum, name):
        self.funcHeader(name)
        for t in sum.types:
            line = ("isinstance = PyObject_IsInstance(obj, "
                    "(PyObject *)%s_type);")
            self.emit(line % (t.name,), 1)
            self.emit("if (isinstance == -1) {", 1)
            self.emit("return 1;", 2)
            self.emit("}", 1)
            self.emit("if (isinstance) {", 1)
            self.emit("*out = %s;" % t.name, 2)
            self.emit("return 0;", 2)
            self.emit("}", 1)
        self.sumTrailer(name)

    def buildArgs(self, fields):
        return ", ".join(fields + ["arena"])

    def complexSum(self, sum, name):
        self.funcHeader(name)
        for a in sum.attributes:
            self.visitAttributeDeclaration(a, name, sum=sum)
        self.emit("", 0)
        # XXX: should we only do this for 'expr'?
        self.emit("if (obj == Py_None) {", 1)
        self.emit("*out = NULL;", 2)
        self.emit("return 0;", 2)
        self.emit("}", 1)
        for a in sum.attributes:
            self.visitField(a, name, sum=sum, depth=1)
        for t in sum.types:
            line = "isinstance = PyObject_IsInstance(obj, (PyObject*)%s_type);"
            self.emit(line % (t.name,), 1)
            self.emit("if (isinstance == -1) {", 1)
            self.emit("return 1;", 2)
            self.emit("}", 1)
            self.emit("if (isinstance) {", 1)
            for f in t.fields:
                self.visitFieldDeclaration(f, t.name, sum=sum, depth=2)
            self.emit("", 0)
            for f in t.fields:
                self.visitField(f, t.name, sum=sum, depth=2)
            args = [f.name.value for f in t.fields] + [a.name.value for a in sum.attributes]
            self.emit("*out = %s(%s);" % (t.name, self.buildArgs(args)), 2)
            self.emit("if (*out == NULL) goto failed;", 2)
            self.emit("return 0;", 2)
            self.emit("}", 1)
        self.sumTrailer(name)

    def visitAttributeDeclaration(self, a, name, sum=sum):
        ctype = get_c_type(a.type)
        self.emit("%s %s;" % (ctype, a.name), 1)

    def visitSum(self, sum, name):
        if is_simple(sum):
            self.simpleSum(sum, name)
        else:
            self.complexSum(sum, name)

    def visitProduct(self, prod, name):
        ctype = get_c_type(name)
        self.emit("int", 0)
        self.emit("obj2ast_%s(PyObject* obj, %s* out, PyArena* arena)" % (name, ctype), 0)
        self.emit("{", 0)
        self.emit("PyObject* tmp = NULL;", 1)
        for f in prod.fields:
            self.visitFieldDeclaration(f, name, prod=prod, depth=1)
        self.emit("", 0)
        for f in prod.fields:
            self.visitField(f, name, prod=prod, depth=1)
        args = [f.name.value for f in prod.fields]
        self.emit("*out = %s(%s);" % (name, self.buildArgs(args)), 1)
        self.emit("return 0;", 1)
        self.emit("failed:", 0)
        self.emit("Py_XDECREF(tmp);", 1)
        self.emit("return 1;", 1)
        self.emit("}", 0)
        self.emit("", 0)

    def visitFieldDeclaration(self, field, name, sum=None, prod=None, depth=0):
        ctype = get_c_type(field.type)
        if field.seq:
            if self.isSimpleType(field):
                self.emit("asdl_int_seq* %s;" % field.name, depth)
            else:
                self.emit("asdl_seq* %s;" % field.name, depth)
        else:
            ctype = get_c_type(field.type)
            self.emit("%s %s;" % (ctype, field.name), depth)

    def isSimpleSum(self, field):
        # XXX can the members of this list be determined automatically?
        return field.type.value in ('expr_context', 'boolop', 'operator',
                                    'unaryop', 'cmpop')

    def isNumeric(self, field):
        return get_c_type(field.type) in ("int", "bool")

    def isSimpleType(self, field):
        return self.isSimpleSum(field) or self.isNumeric(field)

    def visitField(self, field, name, sum=None, prod=None, depth=0):
        ctype = get_c_type(field.type)
        self.emit("if (PyObject_HasAttrString(obj, \"%s\")) {" % field.name, depth)
        self.emit("int res;", depth+1)
        if field.seq:
            self.emit("Py_ssize_t len;", depth+1)
            self.emit("Py_ssize_t i;", depth+1)
        self.emit("tmp = PyObject_GetAttrString(obj, \"%s\");" % field.name, depth+1)
        self.emit("if (tmp == NULL) goto failed;", depth+1)
        if field.seq:
            self.emit("if (!PyList_Check(tmp)) {", depth+1)
            self.emit("PyErr_Format(PyExc_TypeError, \"%s field \\\"%s\\\" must "
                      "be a list, not a %%.200s\", tmp->ob_type->tp_name);" %
                      (name, field.name),
                      depth+2, reflow=False)
            self.emit("goto failed;", depth+2)
            self.emit("}", depth+1)
            self.emit("len = PyList_GET_SIZE(tmp);", depth+1)
            if self.isSimpleType(field):
                self.emit("%s = asdl_int_seq_new(len, arena);" % field.name, depth+1)
            else:
                self.emit("%s = asdl_seq_new(len, arena);" % field.name, depth+1)
            self.emit("if (%s == NULL) goto failed;" % field.name, depth+1)
            self.emit("for (i = 0; i < len; i++) {", depth+1)
            self.emit("%s value;" % ctype, depth+2)
            self.emit("res = obj2ast_%s(PyList_GET_ITEM(tmp, i), &value, arena);" %
                      field.type, depth+2, reflow=False)
            self.emit("if (res != 0) goto failed;", depth+2)
            self.emit("asdl_seq_SET(%s, i, value);" % field.name, depth+2)
            self.emit("}", depth+1)
        else:
            self.emit("res = obj2ast_%s(tmp, &%s, arena);" %
                      (field.type, field.name), depth+1)
            self.emit("if (res != 0) goto failed;", depth+1)

        self.emit("Py_XDECREF(tmp);", depth+1)
        self.emit("tmp = NULL;", depth+1)
        self.emit("} else {", depth)
        if not field.opt:
            message = "required field \\\"%s\\\" missing from %s" % (field.name, name)
            format = "PyErr_SetString(PyExc_TypeError, \"%s\");"
            self.emit(format % message, depth+1, reflow=False)
            self.emit("return 1;", depth+1)
        else:
            if self.isNumeric(field):
                self.emit("%s = 0;" % field.name, depth+1)
            elif not self.isSimpleType(field):
                self.emit("%s = NULL;" % field.name, depth+1)
            else:
                raise TypeError("could not determine the default value for %s" % field.name)
        self.emit("}", depth)


class MarshalPrototypeVisitor(PickleVisitor):

    def prototype(self, sum, name):
        ctype = get_c_type(name)
        self.emit("static int marshal_write_%s(PyObject **, int *, %s);"
                  % (name, ctype), 0)

    visitProduct = visitSum = prototype


class PyTypesDeclareVisitor(PickleVisitor):

    def visitProduct(self, prod, name):
        self.emit("static PyTypeObject *%s_type;" % name, 0)
        self.emit("static PyObject* ast2obj_%s(void*);" % name, 0)
        if prod.fields:
            self.emit("static char *%s_fields[]={" % name,0)
            for f in prod.fields:
                self.emit('"%s",' % f.name, 1)
            self.emit("};", 0)

    def visitSum(self, sum, name):
        self.emit("static PyTypeObject *%s_type;" % name, 0)
        if sum.attributes:
            self.emit("static char *%s_attributes[] = {" % name, 0)
            for a in sum.attributes:
                self.emit('"%s",' % a.name, 1)
            self.emit("};", 0)
        ptype = "void*"
        if is_simple(sum):
            ptype = get_c_type(name)
            tnames = []
            for t in sum.types:
                tnames.append(str(t.name)+"_singleton")
            tnames = ", *".join(tnames)
            self.emit("static PyObject *%s;" % tnames, 0)
        self.emit("static PyObject* ast2obj_%s(%s);" % (name, ptype), 0)
        for t in sum.types:
            self.visitConstructor(t, name)

    def visitConstructor(self, cons, name):
        self.emit("static PyTypeObject *%s_type;" % cons.name, 0)
        if cons.fields:
            self.emit("static char *%s_fields[]={" % cons.name, 0)
            for t in cons.fields:
                self.emit('"%s",' % t.name, 1)
            self.emit("};",0)

class PyTypesVisitor(PickleVisitor):

    def visitModule(self, mod):
        self.emit("""
static int
ast_type_init(PyObject *self, PyObject *args, PyObject *kw)
{
    Py_ssize_t i, numfields = 0;
    int res = -1;
    PyObject *key, *value, *fields;
    fields = PyObject_GetAttrString((PyObject*)Py_TYPE(self), "_fields");
    if (!fields)
        PyErr_Clear();
    if (fields) {
        numfields = PySequence_Size(fields);
        if (numfields == -1)
            goto cleanup;
    }
    res = 0; /* if no error occurs, this stays 0 to the end */
    if (PyTuple_GET_SIZE(args) > 0) {
        if (numfields != PyTuple_GET_SIZE(args)) {
            PyErr_Format(PyExc_TypeError, "%.400s constructor takes %s"
                         "%zd positional argument%s",
                         Py_TYPE(self)->tp_name,
                         numfields == 0 ? "" : "either 0 or ",
                         numfields, numfields == 1 ? "" : "s");
            res = -1;
            goto cleanup;
        }
        for (i = 0; i < PyTuple_GET_SIZE(args); i++) {
            /* cannot be reached when fields is NULL */
            PyObject *name = PySequence_GetItem(fields, i);
            if (!name) {
                res = -1;
                goto cleanup;
            }
            res = PyObject_SetAttr(self, name, PyTuple_GET_ITEM(args, i));
            Py_DECREF(name);
            if (res < 0)
                goto cleanup;
        }
    }
    if (kw) {
        i = 0;  /* needed by PyDict_Next */
        while (PyDict_Next(kw, &i, &key, &value)) {
            res = PyObject_SetAttr(self, key, value);
            if (res < 0)
                goto cleanup;
        }
    }
  cleanup:
    Py_XDECREF(fields);
    return res;
}

/* Pickling support */
static PyObject *
ast_type_reduce(PyObject *self, PyObject *unused)
{
    PyObject *res;
    PyObject *dict = PyObject_GetAttrString(self, "__dict__");
    if (dict == NULL) {
        if (PyErr_ExceptionMatches(PyExc_AttributeError))
            PyErr_Clear();
        else
            return NULL;
    }
    if (dict) {
        res = Py_BuildValue("O()O", Py_TYPE(self), dict);
        Py_DECREF(dict);
        return res;
    }
    return Py_BuildValue("O()", Py_TYPE(self));
}

static PyMethodDef ast_type_methods[] = {
    {"__reduce__", ast_type_reduce, METH_NOARGS, NULL},
    {NULL}
};

static PyTypeObject AST_type = {
    PyVarObject_HEAD_INIT(NULL, 0)
    "typed_ast._ast27.AST",
    sizeof(PyObject),
    0,
    0,                       /* tp_dealloc */
    0,                       /* tp_print */
    0,                       /* tp_getattr */
    0,                       /* tp_setattr */
    0,                       /* tp_compare */
    0,                       /* tp_repr */
    0,                       /* tp_as_number */
    0,                       /* tp_as_sequence */
    0,                       /* tp_as_mapping */
    0,                       /* tp_hash */
    0,                       /* tp_call */
    0,                       /* tp_str */
    PyObject_GenericGetAttr, /* tp_getattro */
    PyObject_GenericSetAttr, /* tp_setattro */
    0,                       /* tp_as_buffer */
    Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /* tp_flags */
    0,                       /* tp_doc */
    0,                       /* tp_traverse */
    0,                       /* tp_clear */
    0,                       /* tp_richcompare */
    0,                       /* tp_weaklistoffset */
    0,                       /* tp_iter */
    0,                       /* tp_iternext */
    ast_type_methods,        /* tp_methods */
    0,                       /* tp_members */
    0,                       /* tp_getset */
    0,                       /* tp_base */
    0,                       /* tp_dict */
    0,                       /* tp_descr_get */
    0,                       /* tp_descr_set */
    0,                       /* tp_dictoffset */
    (initproc)ast_type_init, /* tp_init */
    PyType_GenericAlloc,     /* tp_alloc */
    PyType_GenericNew,       /* tp_new */
    PyObject_Del,            /* tp_free */
};


static PyTypeObject* make_type(char *type, PyTypeObject* base, char**fields, int num_fields)
{
    PyObject *fnames, *result;
    int i;
    fnames = PyTuple_New(num_fields);
    if (!fnames) return NULL;
    for (i = 0; i < num_fields; i++) {
        PyObject *field = PyUnicode_FromString(fields[i]);
        if (!field) {
            Py_DECREF(fnames);
            return NULL;
        }
        PyTuple_SET_ITEM(fnames, i, field);
    }
    result = PyObject_CallFunction((PyObject*)&PyType_Type, "s(O){sOss}",
                    type, base, "_fields", fnames, "__module__", "typed_ast._ast27");
    Py_DECREF(fnames);
    return (PyTypeObject*)result;
}

static int add_attributes(PyTypeObject* type, char**attrs, int num_fields)
{
    int i, result;
    PyObject *s, *l = PyTuple_New(num_fields);
    if (!l)
        return 0;
    for (i = 0; i < num_fields; i++) {
        s = PyUnicode_FromString(attrs[i]);
        if (!s) {
            Py_DECREF(l);
            return 0;
        }
        PyTuple_SET_ITEM(l, i, s);
    }
    result = PyObject_SetAttrString((PyObject*)type, "_attributes", l) >= 0;
    Py_DECREF(l);
    return result;
}

/* Conversion AST -> Python */

static PyObject* ast2obj_list(asdl_seq *seq, PyObject* (*func)(void*))
{
    int i, n = asdl_seq_LEN(seq);
    PyObject *result = PyList_New(n);
    PyObject *value;
    if (!result)
        return NULL;
    for (i = 0; i < n; i++) {
        value = func(asdl_seq_GET(seq, i));
        if (!value) {
            Py_DECREF(result);
            return NULL;
        }
        PyList_SET_ITEM(result, i, value);
    }
    return result;
}

static PyObject* ast2obj_object(void *o)
{
    if (!o)
        o = Py_None;
    Py_INCREF((PyObject*)o);
    return (PyObject*)o;
}
#define ast2obj_identifier ast2obj_object
#define ast2obj_string ast2obj_object
static PyObject* ast2obj_bool(bool b)
{
    return PyBool_FromLong(b);
}

static PyObject* ast2obj_int(long b)
{
    return PyLong_FromLong(b);
}

/* Conversion Python -> AST */

static int obj2ast_object(PyObject* obj, PyObject** out, PyArena* arena)
{
    if (obj == Py_None)
        obj = NULL;
    if (obj)
        PyArena_AddPyObject(arena, obj);
    Py_XINCREF(obj);
    *out = obj;
    return 0;
}

static int obj2ast_identifier(PyObject* obj, PyObject** out, PyArena* arena)
{
    if (!PyUnicode_CheckExact(obj) && obj != Py_None) {
        PyErr_Format(PyExc_TypeError,
                    "AST identifier must be of type str");
        return 1;
    }
    return obj2ast_object(obj, out, arena);
}

static int obj2ast_string(PyObject* obj, PyObject** out, PyArena* arena)
{
    if (!PyUnicode_CheckExact(obj) && !PyBytes_CheckExact(obj)) {
        PyErr_SetString(PyExc_TypeError,
                       "AST string must be of type str or unicode");
        return 1;
    }
    return obj2ast_object(obj, out, arena);
}

static int obj2ast_int(PyObject* obj, int* out, PyArena* arena)
{
    int i;
    if (!PyLong_Check(obj)) {
        PyObject *s = PyObject_Repr(obj);
        if (s == NULL) return 1;
        PyErr_Format(PyExc_ValueError, "invalid integer value: %.400s",
                     _PyUnicode_AsString(s));
        Py_DECREF(s);
        return 1;
    }

    i = (int)PyLong_AsLong(obj);
    if (i == -1 && PyErr_Occurred())
        return 1;
    *out = i;
    return 0;
}

static int obj2ast_bool(PyObject* obj, bool* out, PyArena* arena)
{
    if (!PyBool_Check(obj)) {
        PyObject *s = PyObject_Repr(obj);
        if (s == NULL) return 1;
        PyErr_Format(PyExc_ValueError, "invalid boolean value: %.400s",
                     _PyUnicode_AsString(s));
        Py_DECREF(s);
        return 1;
    }

    *out = (obj == Py_True);
    return 0;
}

static int add_ast_fields(void)
{
    PyObject *empty_tuple, *d;
    if (PyType_Ready(&AST_type) < 0)
        return -1;
    d = AST_type.tp_dict;
    empty_tuple = PyTuple_New(0);
    if (!empty_tuple ||
        PyDict_SetItemString(d, "_fields", empty_tuple) < 0 ||
        PyDict_SetItemString(d, "_attributes", empty_tuple) < 0) {
        Py_XDECREF(empty_tuple);
        return -1;
    }
    Py_DECREF(empty_tuple);
    return 0;
}

""", 0, reflow=False)

        self.emit("static int init_types(void)",0)
        self.emit("{", 0)
        self.emit("static int initialized;", 1)
        self.emit("if (initialized) return 1;", 1)
        self.emit("if (add_ast_fields() < 0) return 0;", 1)
        for dfn in mod.dfns:
            self.visit(dfn)
        self.emit("initialized = 1;", 1)
        self.emit("return 1;", 1)
        self.emit("}", 0)

    def visitProduct(self, prod, name):
        if prod.fields:
            fields = name.value+"_fields"
        else:
            fields = "NULL"
        self.emit('%s_type = make_type("%s", &AST_type, %s, %d);' %
                        (name, name, fields, len(prod.fields)), 1)
        self.emit("if (!%s_type) return 0;" % name, 1)

    def visitSum(self, sum, name):
        self.emit('%s_type = make_type("%s", &AST_type, NULL, 0);' %
                  (name, name), 1)
        self.emit("if (!%s_type) return 0;" % name, 1)
        if sum.attributes:
            self.emit("if (!add_attributes(%s_type, %s_attributes, %d)) return 0;" %
                            (name, name, len(sum.attributes)), 1)
        else:
            self.emit("if (!add_attributes(%s_type, NULL, 0)) return 0;" % name, 1)
        simple = is_simple(sum)
        for t in sum.types:
            self.visitConstructor(t, name, simple)

    def visitConstructor(self, cons, name, simple):
        if cons.fields:
            fields = cons.name.value+"_fields"
        else:
            fields = "NULL"
        self.emit('%s_type = make_type("%s", %s_type, %s, %d);' %
                            (cons.name, cons.name, name, fields, len(cons.fields)), 1)
        self.emit("if (!%s_type) return 0;" % cons.name, 1)
        if simple:
            self.emit("%s_singleton = PyType_GenericNew(%s_type, NULL, NULL);" %
                             (cons.name, cons.name), 1)
            self.emit("if (!%s_singleton) return 0;" % cons.name, 1)


class ASTModuleVisitor(PickleVisitor):

    def visitModule(self, mod):
        self.emit('PyObject *ast27_parse(PyObject *self, PyObject *args);', 0)
        self.emit('static PyMethodDef ast27_methods[] = {', 0)
        self.emit('{"parse",  ast27_parse, METH_VARARGS, "Parse string into typed AST."},', 1)
        self.emit('{NULL, NULL, 0, NULL}', 1)
        self.emit('};', 0)

        self.emit("static struct PyModuleDef _astmodule27 = {", 0)
        self.emit('  PyModuleDef_HEAD_INIT, "_ast27", NULL, 0, ast27_methods', 0)
        self.emit("};", 0)
        self.emit("PyMODINIT_FUNC", 0)
        self.emit("PyInit__ast27(void)", 0)
        self.emit("{", 0)
        self.emit("PyObject *m, *d;", 1)
        self.emit("if (!init_types()) return NULL;", 1)
        self.emit('m = PyModule_Create(&_astmodule27);', 1)
        self.emit("if (!m) return NULL;", 1)
        self.emit("d = PyModule_GetDict(m);", 1)
        self.emit('if (PyDict_SetItemString(d, "AST", (PyObject*)&AST_type) < 0) return NULL;', 1)
        self.emit('if (PyModule_AddIntMacro(m, PyCF_ONLY_AST) < 0)', 1)
        self.emit("return NULL;", 2)
        for dfn in mod.dfns:
            self.visit(dfn)
        self.emit("return m;", 1)
        self.emit("}", 0)

    def visitProduct(self, prod, name):
        self.addObj(name)

    def visitSum(self, sum, name):
        self.addObj(name)
        for t in sum.types:
            self.visitConstructor(t, name)

    def visitConstructor(self, cons, name):
        self.addObj(cons.name)

    def addObj(self, name):
        self.emit('if (PyDict_SetItemString(d, "%s", (PyObject*)%s_type) < 0) return NULL;' % (name, name), 1)


_SPECIALIZED_SEQUENCES = ('stmt', 'expr')

def find_sequence(fields, doing_specialization):
    """Return True if any field uses a sequence that will not be specialized."""
    for f in fields:
        if f.seq:
            if not doing_specialization:
                return True
            if str(f.type) not in _SPECIALIZED_SEQUENCES:
                return True
    return False

def has_sequence(types, doing_specialization):
    for t in types:
        if find_sequence(t.fields, doing_specialization):
            return True
    return False


class StaticVisitor(PickleVisitor):
    CODE = '''Very simple, always emit this static code.  Override CODE'''

    def visit(self, object):
        self.emit(self.CODE, 0, reflow=False)


class ObjVisitor(PickleVisitor):

    def func_begin(self, name):
        ctype = get_c_type(name)
        self.emit("PyObject*", 0)
        self.emit("ast2obj_%s(void* _o)" % (name), 0)
        self.emit("{", 0)
        self.emit("%s o = (%s)_o;" % (ctype, ctype), 1)
        self.emit("PyObject *result = NULL, *value = NULL;", 1)
        self.emit('if (!o) {', 1)
        self.emit("Py_INCREF(Py_None);", 2)
        self.emit('return Py_None;', 2)
        self.emit("}", 1)
        self.emit('', 0)

    def func_end(self):
        self.emit("return result;", 1)
        self.emit("failed:", 0)
        self.emit("Py_XDECREF(value);", 1)
        self.emit("Py_XDECREF(result);", 1)
        self.emit("return NULL;", 1)
        self.emit("}", 0)
        self.emit("", 0)

    def visitSum(self, sum, name):
        if is_simple(sum):
            self.simpleSum(sum, name)
            return
        self.func_begin(name)
        self.emit("switch (o->kind) {", 1)
        for i in range(len(sum.types)):
            t = sum.types[i]
            self.visitConstructor(t, i + 1, name)
        self.emit("}", 1)
        for a in sum.attributes:
            self.emit("value = ast2obj_%s(o->%s);" % (a.type, a.name), 1)
            self.emit("if (!value) goto failed;", 1)
            self.emit('if (PyObject_SetAttrString(result, "%s", value) < 0)' % a.name, 1)
            self.emit('goto failed;', 2)
            self.emit('Py_DECREF(value);', 1)
        self.func_end()

    def simpleSum(self, sum, name):
        self.emit("PyObject* ast2obj_%s(%s_ty o)" % (name, name), 0)
        self.emit("{", 0)
        self.emit("switch(o) {", 1)
        for t in sum.types:
            self.emit("case %s:" % t.name, 2)
            self.emit("Py_INCREF(%s_singleton);" % t.name, 3)
            self.emit("return %s_singleton;" % t.name, 3)
        self.emit("default:", 2)
        self.emit('/* should never happen, but just in case ... */', 3)
        code = "PyErr_Format(PyExc_SystemError, \"unknown %s found\");" % name
        self.emit(code, 3, reflow=False)
        self.emit("return NULL;", 3)
        self.emit("}", 1)
        self.emit("}", 0)

    def visitProduct(self, prod, name):
        self.func_begin(name)
        self.emit("result = PyType_GenericNew(%s_type, NULL, NULL);" % name, 1)
        self.emit("if (!result) return NULL;", 1)
        for field in prod.fields:
            self.visitField(field, name, 1, True)
        self.func_end()

    def visitConstructor(self, cons, enum, name):
        self.emit("case %s_kind:" % cons.name, 1)
        self.emit("result = PyType_GenericNew(%s_type, NULL, NULL);" % cons.name, 2)
        self.emit("if (!result) goto failed;", 2)
        for f in cons.fields:
            self.visitField(f, cons.name, 2, False)
        self.emit("break;", 2)

    def visitField(self, field, name, depth, product):
        def emit(s, d):
            self.emit(s, depth + d)
        if product:
            value = "o->%s" % field.name
        else:
            value = "o->v.%s.%s" % (name, field.name)
        self.set(field, value, depth)
        emit("if (!value) goto failed;", 0)
        emit('if (PyObject_SetAttrString(result, "%s", value) == -1)' % field.name, 0)
        emit("goto failed;", 1)
        emit("Py_DECREF(value);", 0)

    def emitSeq(self, field, value, depth, emit):
        emit("seq = %s;" % value, 0)
        emit("n = asdl_seq_LEN(seq);", 0)
        emit("value = PyList_New(n);", 0)
        emit("if (!value) goto failed;", 0)
        emit("for (i = 0; i < n; i++) {", 0)
        self.set(field, "asdl_seq_GET(seq, i)", depth + 1)
        emit("if (!value1) goto failed;", 1)
        emit("PyList_SET_ITEM(value, i, value1);", 1)
        emit("value1 = NULL;", 1)
        emit("}", 0)

    def set(self, field, value, depth):
        if field.seq:
            # XXX should really check for is_simple, but that requires a symbol table
            if field.type.value == "cmpop":
                # While the sequence elements are stored as void*,
                # ast2obj_cmpop expects an enum
                self.emit("{", depth)
                self.emit("int i, n = asdl_seq_LEN(%s);" % value, depth+1)
                self.emit("value = PyList_New(n);", depth+1)
                self.emit("if (!value) goto failed;", depth+1)
                self.emit("for(i = 0; i < n; i++)", depth+1)
                # This cannot fail, so no need for error handling
                self.emit("PyList_SET_ITEM(value, i, ast2obj_cmpop((cmpop_ty)asdl_seq_GET(%s, i)));" % value,
                          depth+2, reflow=False)
                self.emit("}", depth)
            else:
                self.emit("value = ast2obj_list(%s, ast2obj_%s);" % (value, field.type), depth)
        else:
            ctype = get_c_type(field.type)
            self.emit("value = ast2obj_%s(%s);" % (field.type, value), depth, reflow=False)


class PartingShots(StaticVisitor):

    CODE = """
PyObject* Ta27AST_mod2obj(mod_ty t)
{
    init_types();
    return ast2obj_mod(t);
}

/* mode is 0 for "exec", 1 for "eval" and 2 for "single" input */
mod_ty Ta27AST_obj2mod(PyObject* ast, PyArena* arena, int mode)
{
    mod_ty res;
    PyObject *req_type[3];
    char *req_name[3];
    int isinstance;

    req_type[0] = (PyObject*)Module_type;
    req_type[1] = (PyObject*)Expression_type;
    req_type[2] = (PyObject*)Interactive_type;

    req_name[0] = "Module";
    req_name[1] = "Expression";
    req_name[2] = "Interactive";

    assert(0 <= mode && mode <= 2);

    init_types();

    isinstance = PyObject_IsInstance(ast, req_type[mode]);
    if (isinstance == -1)
        return NULL;
    if (!isinstance) {
        PyErr_Format(PyExc_TypeError, "expected %s node, got %.400s",
                     req_name[mode], Py_TYPE(ast)->tp_name);
        return NULL;
    }
    if (obj2ast_mod(ast, &res, arena) != 0)
        return NULL;
    else
        return res;
}

int Ta27AST_Check(PyObject* obj)
{
    init_types();
    return PyObject_IsInstance(obj, (PyObject*)&AST_type);
}
"""

class ChainOfVisitors:
    def __init__(self, *visitors):
        self.visitors = visitors

    def visit(self, object):
        for v in self.visitors:
            v.visit(object)
            v.emit("", 0)

common_msg = "/* File automatically generated by %s. */\n\n"

c_file_msg = """
/*
   __version__ %s.

   This module must be committed separately after each AST grammar change;
   The __version__ number is set to the revision number of the commit
   containing the grammar change.
*/

"""

def main(srcfile):
    argv0 = sys.argv[0]
    components = argv0.split(os.sep)
    argv0 = os.sep.join(components[-2:])
    auto_gen_msg = common_msg % argv0
    mod = asdl.parse(srcfile)
    mod.version = "82160"
    if not asdl.check(mod):
        sys.exit(1)
    if INC_DIR:
        p = "%s/%s-ast.h" % (INC_DIR, mod.name)
        f = open(p, "wb")
        f.write(auto_gen_msg)
        f.write('#include "asdl.h"\n\n')
        c = ChainOfVisitors(TypeDefVisitor(f),
                            StructVisitor(f),
                            PrototypeVisitor(f),
                            )
        c.visit(mod)
        f.write("PyObject* Ta27AST_mod2obj(mod_ty t);\n")
        f.write("mod_ty Ta27AST_obj2mod(PyObject* ast, PyArena* arena, int mode);\n")
        f.write("int Ta27AST_Check(PyObject* obj);\n")
        f.close()

    if SRC_DIR:
        p = os.path.join(SRC_DIR, str(mod.name) + "-ast.c")
        f = open(p, "wb")
        f.write(auto_gen_msg)
        f.write(c_file_msg % mod.version)
        f.write('#include "Python.h"\n')
        f.write('#include "%s-ast.h"\n' % mod.name)
        f.write('\n')
        f.write("static PyTypeObject AST_type;\n")
        v = ChainOfVisitors(
            PyTypesDeclareVisitor(f),
            PyTypesVisitor(f),
            Obj2ModPrototypeVisitor(f),
            FunctionVisitor(f),
            ObjVisitor(f),
            Obj2ModVisitor(f),
            ASTModuleVisitor(f),
            PartingShots(f),
            )
        v.visit(mod)
        f.close()

if __name__ == "__main__":
    import sys
    import getopt

    INC_DIR = ''
    SRC_DIR = ''
    opts, args = getopt.getopt(sys.argv[1:], "h:c:")
    if len(opts) != 1:
        print "Must specify exactly one output file"
        sys.exit(1)
    for o, v in opts:
        if o == '-h':
            INC_DIR = v
        if o == '-c':
            SRC_DIR = v
    if len(args) != 1:
        print "Must specify single input file"
        sys.exit(1)
    main(args[0])


================================================
FILE: ast27/Parser/bitset.c
================================================

/* Bitset primitives used by the parser generator */

#include "../Include/pgenheaders.h"
#include "../Include/bitset.h"

bitset
newbitset(int nbits)
{
    int nbytes = NBYTES(nbits);
    bitset ss = (char *)PyObject_MALLOC(sizeof(BYTE) *  nbytes);

    if (ss == NULL)
        Py_FatalError("no mem for bitset");

    ss += nbytes;
    while (--nbytes >= 0)
        *--ss = 0;
    return ss;
}

void
delbitset(bitset ss)
{
    PyObject_FREE(ss);
}

int
addbit(bitset ss, int ibit)
{
    int ibyte = BIT2BYTE(ibit);
    BYTE mask = BIT2MASK(ibit);

    if (ss[ibyte] & mask)
        return 0; /* Bit already set */
    ss[ibyte] |= mask;
    return 1;
}

#if 0 /* Now a macro */
int
testbit(bitset ss, int ibit)
{
    return (ss[BIT2BYTE(ibit)] & BIT2MASK(ibit)) != 0;
}
#endif

int
samebitset(bitset ss1, bitset ss2, int nbits)
{
    int i;

    for (i = NBYTES(nbits); --i >= 0; )
        if (*ss1++ != *ss2++)
            return 0;
    return 1;
}

void
mergebitset(bitset ss1, bitset ss2, int nbits)
{
    int i;

    for (i = NBYTES(nbits); --i >= 0; )
        *ss1++ |= *ss2++;
}


================================================
FILE: ast27/Parser/grammar.c
================================================

/* Grammar implementation */

#include "Python.h"
#include "../Include/pgenheaders.h"

#include <ctype.h>

#include "../Include/token.h"
#include "../Include/grammar.h"

#ifdef RISCOS
#include <unixlib.h>
#endif

PyAPI_DATA(int) Py_DebugFlag;

grammar *
newgrammar(int start)
{
    grammar *g;

    g = (grammar *)PyObject_MALLOC(sizeof(grammar));
    if (g == NULL)
        Py_FatalError("no mem for new grammar");
    g->g_ndfas = 0;
    g->g_dfa = NULL;
    g->g_start = start;
    g->g_ll.ll_nlabels = 0;
    g->g_ll.ll_label = NULL;
    g->g_accel = 0;
    return g;
}

dfa *
adddfa(grammar *g, int type, char *name)
{
    dfa *d;

    g->g_dfa = (dfa *)PyObject_REALLOC(g->g_dfa,
                                        sizeof(dfa) * (g->g_ndfas + 1));
    if (g->g_dfa == NULL)
        Py_FatalError("no mem to resize dfa in adddfa");
    d = &g->g_dfa[g->g_ndfas++];
    d->d_type = type;
    d->d_name = strdup(name);
    d->d_nstates = 0;
    d->d_state = NULL;
    d->d_initial = -1;
    d->d_first = NULL;
    return d; /* Only use while fresh! */
}

int
addstate(dfa *d)
{
    state *s;

    d->d_state = (state *)PyObject_REALLOC(d->d_state,
                                  sizeof(state) * (d->d_nstates + 1));
    if (d->d_state == NULL)
        Py_FatalError("no mem to resize state in addstate");
    s = &d->d_state[d->d_nstates++];
    s->s_narcs = 0;
    s->s_arc = NULL;
    s->s_lower = 0;
    s->s_upper = 0;
    s->s_accel = NULL;
    s->s_accept = 0;
    return s - d->d_state;
}

void
addarc(dfa *d, int from, int to, int lbl)
{
    state *s;
    arc *a;

    assert(0 <= from && from < d->d_nstates);
    assert(0 <= to && to < d->d_nstates);

    s = &d->d_state[from];
    s->s_arc = (arc *)PyObject_REALLOC(s->s_arc, sizeof(arc) * (s->s_narcs + 1));
    if (s->s_arc == NULL)
        Py_FatalError("no mem to resize arc list in addarc");
    a = &s->s_arc[s->s_narcs++];
    a->a_lbl = lbl;
    a->a_arrow = to;
}

int
addlabel(labellist *ll, int type, char *str)
{
    int i;
    label *lb;

    for (i = 0; i < ll->ll_nlabels; i++) {
        if (ll->ll_label[i].lb_type == type &&
            strcmp(ll->ll_label[i].lb_str, str) == 0)
            return i;
    }
    ll->ll_label = (label *)PyObject_REALLOC(ll->ll_label,
                                    sizeof(label) * (ll->ll_nlabels + 1));
    if (ll->ll_label == NULL)
        Py_FatalError("no mem to resize labellist in addlabel");
    lb = &ll->ll_label[ll->ll_nlabels++];
    lb->lb_type = type;
    lb->lb_str = strdup(str);
    if (Py_DebugFlag)
        printf("Label @ %8p, %d: %s\n", ll, ll->ll_nlabels,
               Ta27Grammar_LabelRepr(lb));
    return lb - ll->ll_label;
}

/* Same, but rather dies than adds */

int
findlabel(labellist *ll, int type, char *str)
{
    int i;

    for (i = 0; i < ll->ll_nlabels; i++) {
        if (ll->ll_label[i].lb_type == type /*&&
            strcmp(ll->ll_label[i].lb_str, str) == 0*/)
            return i;
    }
    fprintf(stderr, "Label %d/'%s' not found\n", type, str);
    Py_FatalError("grammar.c:findlabel()");
    return 0; /* Make gcc -Wall happy */
}

/* Forward */
static void translabel(grammar *, label *);

void
translatelabels(grammar *g)
{
    int i;

#ifdef Py_DEBUG
    printf("Translating labels ...\n");
#endif
    /* Don't translate EMPTY */
    for (i = EMPTY+1; i < g->g_ll.ll_nlabels; i++)
        translabel(g, &g->g_ll.ll_label[i]);
}

static void
translabel(grammar *g, label *lb)
{
    int i;

    if (Py_DebugFlag)
        printf("Translating label %s ...\n", Ta27Grammar_LabelRepr(lb));

    if (lb->lb_type == NAME) {
        for (i = 0; i < g->g_ndfas; i++) {
            if (strcmp(lb->lb_str, g->g_dfa[i].d_name) == 0) {
                if (Py_DebugFlag)
                    printf(
                        "Label %s is non-terminal %d.\n",
                        lb->lb_str,
                        g->g_dfa[i].d_type);
                lb->lb_type = g->g_dfa[i].d_type;
                free(lb->lb_str);
                lb->lb_str = NULL;
                return;
            }
        }
        for (i = 0; i < (int)N_TOKENS; i++) {
            if (strcmp(lb->lb_str, _Ta27Parser_TokenNames[i]) == 0) {
                if (Py_DebugFlag)
                    printf("Label %s is terminal %d.\n",
                        lb->lb_str, i);
                lb->lb_type = i;
                free(lb->lb_str);
                lb->lb_str = NULL;
                return;
            }
        }
        printf("Can't translate NAME label '%s'\n", lb->lb_str);
        return;
    }

    if (lb->lb_type == STRING) {
        if (isalpha(Py_CHARMASK(lb->lb_str[1])) ||
            lb->lb_str[1] == '_') {
            char *p;
            char *src;
            char *dest;
            size_t name_len;
            if (Py_DebugFlag)
                printf("Label %s is a keyword\n", lb->lb_str);
            lb->lb_type = NAME;
            src = lb->lb_str + 1;
            p = strchr(src, '\'');
            if (p)
                name_len = p - src;
            else
                name_len = strlen(src);
            dest = (char *)malloc(name_len + 1);
            if (!dest) {
                printf("Can't alloc dest '%s'\n", src);
                return;
            }
            strncpy(dest, src, name_len);
            dest[name_len] = '\0';
            free(lb->lb_str);
            lb->lb_str = dest;
        }
        else if (lb->lb_str[2] == lb->lb_str[0]) {
            int type = (int) Ta27Token_OneChar(lb->lb_str[1]);
            if (type != OP) {
                lb->lb_type = type;
                free(lb->lb_str);
                lb->lb_str = NULL;
            }
            else
                printf("Unknown OP label %s\n",
                    lb->lb_str);
        }
        else if (lb->lb_str[2] && lb->lb_str[3] == lb->lb_str[0]) {
            int type = (int) Ta27Token_TwoChars(lb->lb_str[1],
                                       lb->lb_str[2]);
            if (type != OP) {
                lb->lb_type = type;
                free(lb->lb_str);
                lb->lb_str = NULL;
            }
            else
                printf("Unknown OP label %s\n",
                    lb->lb_str);
        }
        else if (lb->lb_str[2] && lb->lb_str[3] && lb->lb_str[4] == lb->lb_str[0]) {
            int type = (int) Ta27Token_ThreeChars(lb->lb_str[1],
                                                lb->lb_str[2],
                                                lb->lb_str[3]);
            if (type != OP) {
                lb->lb_type = type;
                free(lb->lb_str);
                lb->lb_str = NULL;
            }
            else
                printf("Unknown OP label %s\n",
                    lb->lb_str);
        }
        else
            printf("Can't translate STRING label %s\n",
                lb->lb_str);
    }
    else
        printf("Can't translate label '%s'\n",
               Ta27Grammar_LabelRepr(lb));
}


================================================
FILE: ast27/Parser/grammar1.c
================================================

/* Grammar subroutines needed by parser */

#include "Python.h"
#include "../Include/pgenheaders.h"
#include "../Include/grammar.h"
#include "../Include/token.h"

/* Return the DFA for the given type */

dfa *
Ta27Grammar_FindDFA(grammar *g, register int type)
{
    register dfa *d;
#if 1
    /* Massive speed-up */
    d = &g->g_dfa[type - NT_OFFSET];
    assert(d->d_type == type);
    return d;
#else
    /* Old, slow version */
    register int i;

    for (i = g->g_ndfas, d = g->g_dfa; --i >= 0; d++) {
        if (d->d_type == type)
            return d;
    }
    assert(0);
    /* NOTREACHED */
#endif
}

char *
Ta27Grammar_LabelRepr(label *lb)
{
    static char buf[100];

    if (lb->lb_type == ENDMARKER)
        return "EMPTY";
    else if (ISNONTERMINAL(lb->lb_type)) {
        if (lb->lb_str == NULL) {
            PyOS_snprintf(buf, sizeof(buf), "NT%d", lb->lb_type);
            return buf;
        }
        else
            return lb->lb_str;
    }
    else {
        if (lb->lb_str == NULL)
            return _Ta27Parser_TokenNames[lb->lb_type];
        else {
            PyOS_snprintf(buf, sizeof(buf), "%.32s(%.32s)",
                _Ta27Parser_TokenNames[lb->lb_type], lb->lb_str);
            return buf;
        }
    }
}


================================================
FILE: ast27/Parser/node.c
================================================
/* Parse tree node implementation */

#include "Python.h"
#include "../Include/node.h"
#include "../Include/errcode.h"

node *
Ta27Node_New(int type)
{
    node *n = (node *) PyObject_MALLOC(1 * sizeof(node));
    if (n == NULL)
        return NULL;
    n->n_type = type;
    n->n_str = NULL;
    n->n_lineno = 0;
    n->n_nchildren = 0;
    n->n_child = NULL;
    return n;
}

/* See comments at XXXROUNDUP below.  Returns -1 on overflow. */
static int
fancy_roundup(int n)
{
    /* Round up to the closest power of 2 >= n. */
    int result = 256;
    assert(n > 128);
    while (result < n) {
        result <<= 1;
        if (result <= 0)
            return -1;
    }
    return result;
}

/* A gimmick to make massive numbers of reallocs quicker.  The result is
 * a number >= the input.  In Ta27Node_AddChild, it's used like so, when
 * we're about to add child number current_size + 1:
 *
 *     if XXXROUNDUP(current_size) < XXXROUNDUP(current_size + 1):
 *         allocate space for XXXROUNDUP(current_size + 1) total children
 *     else:
 *         we already have enough space
 *
 * Since a node starts out empty, we must have
 *
 *     XXXROUNDUP(0) < XXXROUNDUP(1)
 *
 * so that we allocate space for the first child.  One-child nodes are very
 * common (presumably that would change if we used a more abstract form
 * of syntax tree), so to avoid wasting memory it's desirable that
 * XXXROUNDUP(1) == 1.  That in turn forces XXXROUNDUP(0) == 0.
 *
 * Else for 2 <= n <= 128, we round up to the closest multiple of 4.  Why 4?
 * Rounding up to a multiple of an exact power of 2 is very efficient, and
 * most nodes with more than one child have <= 4 kids.
 *
 * Else we call fancy_roundup() to grow proportionately to n.  We've got an
 * extreme case then (like test_longexp.py), and on many platforms doing
 * anything less than proportional growth leads to exorbitant runtime
 * (e.g., MacPython), or extreme fragmentation of user address space (e.g.,
 * Win98).
 *
 * In a run of compileall across the 2.3a0 Lib directory, Andrew MacIntyre
 * reported that, with this scheme, 89% of PyObject_REALLOC calls in
 * Ta27Node_AddChild passed 1 for the size, and 9% passed 4.  So this usually
 * wastes very little memory, but is very effective at sidestepping
 * platform-realloc disasters on vulnerable platforms.
 *
 * Note that this would be straightforward if a node stored its current
 * capacity.  The code is tricky to avoid that.
 */
#define XXXROUNDUP(n) ((n) <= 1 ? (n) :                 \
               (n) <= 128 ? (((n) + 3) & ~3) :          \
               fancy_roundup(n))


int
Ta27Node_AddChild(register node *n1, int type, char *str, int lineno, int col_offset)
{
    const int nch = n1->n_nchildren;
    int current_capacity;
    int required_capacity;
    node *n;

    if (nch == INT_MAX || nch < 0)
        return E_OVERFLOW;

    current_capacity = XXXROUNDUP(nch);
    required_capacity = XXXROUNDUP(nch + 1);
    if (current_capacity < 0 || required_capacity < 0)
        return E_OVERFLOW;
    if (current_capacity < required_capacity) {
        if ((size_t)required_capacity > PY_SIZE_MAX / sizeof(node)) {
            return E_NOMEM;
        }
        n = n1->n_child;
        n = (node *) PyObject_REALLOC(n,
                                      required_capacity * sizeof(node));
        if (n == NULL)
            return E_NOMEM;
        n1->n_child = n;
    }

    n = &n1->n_child[n1->n_nchildren++];
    n->n_type = type;
    n->n_str = str;
    n->n_lineno = lineno;
    n->n_col_offset = col_offset;
    n->n_nchildren = 0;
    n->n_child = NULL;
    return 0;
}

/* Forward */
static void freechildren(node *);
static Py_ssize_t sizeofchildren(node *n);


void
Ta27Node_Free(node *n)
{
    if (n != NULL) {
        freechildren(n);
        PyObject_FREE(n);
    }
}

Py_ssize_t
_Ta27Node_SizeOf(node *n)
{
    Py_ssize_t res = 0;

    if (n != NULL)
        res = sizeof(node) + sizeofchildren(n);
    return res;
}

static void
freechildren(node *n)
{
    int i;
    for (i = NCH(n); --i >= 0; )
        freechildren(CHILD(n, i));
    if (n->n_child != NULL)
        PyObject_FREE(n->n_child);
    if (STR(n) != NULL)
        PyObject_FREE(STR(n));
}

static Py_ssize_t
sizeofchildren(node *n)
{
    Py_ssize_t res = 0;
    int i;
    for (i = NCH(n); --i >= 0; )
        res += sizeofchildren(CHILD(n, i));
    if (n->n_child != NULL)
        /* allocated size of n->n_child array */
        res += XXXROUNDUP(NCH(n)) * sizeof(node);
    if (STR(n) != NULL)
        res += strlen(STR(n)) + 1;
    return res;
}


================================================
FILE: ast27/Parser/parser.c
================================================

/* Parser implementation */

/* For a description, see the comments at end of this file */

/* XXX To do: error recovery */

#include "Python.h"
#include "../Include/pgenheaders.h"
#include "../Include/token.h"
#include "../Include/grammar.h"
#include "../Include/node.h"
#include "parser.h"
#include "../Include/errcode.h"


#ifdef Py_DEBUG
PyAPI_DATA(int) Py_DebugFlag;
#define D(x) if (!Py_DebugFlag); else x
#else
#define D(x)
#endif


/* STACK DATA TYPE */

static void s_reset(stack *);

static void
s_reset(stack *s)
{
    s->s_top = &s->s_base[MAXSTACK];
}

#define s_empty(s) ((s)->s_top == &(s)->s_base[MAXSTACK])

static int
s_push(register stack *s, dfa *d, node *parent)
{
    register stackentry *top;
    if (s->s_top == s->s_base) {
        fprintf(stderr, "s_push: parser stack overflow\n");
        return E_NOMEM;
    }
    top = --s->s_top;
    top->s_dfa = d;
    top->s_parent = parent;
    top->s_state = 0;
    return 0;
}

#ifdef Py_DEBUG

static void
s_pop(register stack *s)
{
    if (s_empty(s))
        Py_FatalError("s_pop: parser stack underflow -- FATAL");
    s->s_top++;
}

#else /* !Py_DEBUG */

#define s_pop(s) (s)->s_top++

#endif


/* PARSER CREATION */

parser_state *
Ta27Parser_New(grammar *g, int start)
{
    parser_state *ps;

    if (!g->g_accel)
        Ta27Grammar_AddAccelerators(g);
    ps = (parser_state *)PyMem_MALLOC(sizeof(parser_state));
    if (ps == NULL)
        return NULL;
    ps->p_grammar = g;
#ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD
    ps->p_flags = 0;
#endif
    ps->p_tree = Ta27Node_New(start);
    if (ps->p_tree == NULL) {
        PyMem_FREE(ps);
        return NULL;
    }
    s_reset(&ps->p_stack);
    (void) s_push(&ps->p_stack, Ta27Grammar_FindDFA(g, start), ps->p_tree);
    return ps;
}

void
Ta27Parser_Delete(parser_state *ps)
{
    /* NB If you want to save the parse tree,
       you must set p_tree to NULL before calling delparser! */
    Ta27Node_Free(ps->p_tree);
    PyMem_FREE(ps);
}


/* PARSER STACK OPERATIONS */

static int
shift(register stack *s, int type, char *str, int newstate, int lineno, int col_offset)
{
    int err;
    assert(!s_empty(s));
    err = Ta27Node_AddChild(s->s_top->s_parent, type, str, lineno, col_offset);
    if (err)
        return err;
    s->s_top->s_state = newstate;
    return 0;
}

static int
push(register stack *s, int type, dfa *d, int newstate, int lineno, int col_offset)
{
    int err;
    register node *n;
    n = s->s_top->s_parent;
    assert(!s_empty(s));
    err = Ta27Node_AddChild(n, type, (char *)NULL, lineno, col_offset);
    if (err)
        return err;
    s->s_top->s_state = newstate;
    return s_push(s, d, CHILD(n, NCH(n)-1));
}


/* PARSER PROPER */

static int
classify(parser_state *ps, int type, char *str)
{
    grammar *g = ps->p_grammar;
    register int n = g->g_ll.ll_nlabels;

    if (type == NAME) {
        register char *s = str;
        register label *l = g->g_ll.ll_label;
        register int i;
        for (i = n; i > 0; i--, l++) {
            if (l->lb_type != NAME || l->lb_str == NULL ||
                l->lb_str[0] != s[0] ||
                strcmp(l->lb_str, s) != 0)
                continue;
#ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD
            if (ps->p_flags & CO_FUTURE_PRINT_FUNCTION &&
                s[0] == 'p' && strcmp(s, "print") == 0) {
                break; /* no longer a keyword */
            }
#endif
            D(printf("It's a keyword\n"));
            return n - i;
        }
    }

    {
        register label *l = g->g_ll.ll_label;
        register int i;
        for (i = n; i > 0; i--, l++) {
            if (l->lb_type == type && l->lb_str == NULL) {
                D(printf("It's a token we know\n"));
                return n - i;
            }
        }
    }

    D(printf("Illegal token\n"));
    return -1;
}

#ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD
static void
future_hack(parser_state *ps)
{
    node *n = ps->p_stack.s_top->s_parent;
    node *ch, *cch;
    int i;

    /* from __future__ import ..., must have at least 4 children */
    n = CHILD(n, 0);
    if (NCH(n) < 4)
        return;
    ch = CHILD(n, 0);
    if (STR(ch) == NULL || strcmp(STR(ch), "from") != 0)
        return;
    ch = CHILD(n, 1);
    if (NCH(ch) == 1 && STR(CHILD(ch, 0)) &&
        strcmp(STR(CHILD(ch, 0)), "__future__") != 0)
        return;
    ch = CHILD(n, 3);
    /* ch can be a star, a parenthesis or import_as_names */
    if (TYPE(ch) == STAR)
        return;
    if (TYPE(ch) == LPAR)
        ch = CHILD(n, 4);

    for (i = 0; i < NCH(ch); i += 2) {
        cch = CHILD(ch, i);
        if (NCH(cch) >= 1 && TYPE(CHILD(cch, 0)) == NAME) {
            char *str_ch = STR(CHILD(cch, 0));
            if (strcmp(str_ch, FUTURE_WITH_STATEMENT) == 0) {
                ps->p_flags |= CO_FUTURE_WITH_STATEMENT;
            } else if (strcmp(str_ch, FUTURE_PRINT_FUNCTION) == 0) {
                ps->p_flags |= CO_FUTURE_PRINT_FUNCTION;
            } else if (strcmp(str_ch, FUTURE_UNICODE_LITERALS) == 0) {
                ps->p_flags |= CO_FUTURE_UNICODE_LITERALS;
            }
        }
    }
}
#endif /* future keyword */

int
Ta27Parser_AddToken(register parser_state *ps, register int type, char *str,
                  int lineno, int col_offset, int *expected_ret)
{
    register int ilabel;
    int err;

    D(printf("Token %s/'%s' ... ", _Ta27Parser_TokenNames[type], str));

    /* Find out which label this token is */
    ilabel = classify(ps, type, str);
    if (ilabel < 0)
        return E_SYNTAX;

    /* Loop until the token is shifted or an error occurred */
    for (;;) {
        /* Fetch the current dfa and state */
        register dfa *d = ps->p_stack.s_top->s_dfa;
        register state *s = &d->d_state[ps->p_stack.s_top->s_state];

        D(printf(" DFA '%s', state %d:",
            d->d_name, ps->p_stack.s_top->s_state));

        /* Check accelerator */
        if (s->s_lower <= ilabel && ilabel < s->s_upper) {
            register int x = s->s_accel[ilabel - s->s_lower];
            if (x != -1) {
                if (x & (1<<7)) {
                    /* Push non-terminal */
                    int nt = (x >> 8) + NT_OFFSET;
                    int arrow = x & ((1<<7)-1);
                    dfa *d1 = Ta27Grammar_FindDFA(
                        ps->p_grammar, nt);
                    if ((err = push(&ps->p_stack, nt, d1,
                        arrow, lineno, col_offset)) > 0) {
                        D(printf(" MemError: push\n"));
                        return err;
                    }
                    D(printf(" Push ...\n"));
                    continue;
                }

                /* Shift the token */
                if ((err = shift(&ps->p_stack, type, str,
                                x, lineno, col_offset)) > 0) {
                    D(printf(" MemError: shift.\n"));
                    return err;
                }
                D(printf(" Shift.\n"));
                /* Pop while we are in an accept-only state */
                while (s = &d->d_state
                                [ps->p_stack.s_top->s_state],
                    s->s_accept && s->s_narcs == 1) {
                    D(printf("  DFA '%s', state %d: "
                             "Direct pop.\n",
                             d->d_name,
                             ps->p_stack.s_top->s_state));
#ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD
                    if (d->d_name[0] == 'i' &&
                        strcmp(d->d_name,
                           "import_stmt") == 0)
                        future_hack(ps);
#endif
                    s_pop(&ps->p_stack);
                    if (s_empty(&ps->p_stack)) {
                        D(printf("  ACCEPT.\n"));
                        return E_DONE;
                    }
                    d = ps->p_stack.s_top->s_dfa;
                }
                return E_OK;
            }
        }

        if (s->s_accept) {
#ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD
            if (d->d_name[0] == 'i' &&
                strcmp(d->d_name, "import_stmt") == 0)
                future_hack(ps);
#endif
            /* Pop this dfa and try again */
            s_pop(&ps->p_stack);
            D(printf(" Pop ...\n"));
            if (s_empty(&ps->p_stack)) {
                D(printf(" Error: bottom of stack.\n"));
                return E_SYNTAX;
            }
            continue;
        }

        /* Stuck, report syntax error */
        D(printf(" Error.\n"));
        if (expected_ret) {
            if (s->s_lower == s->s_upper - 1) {
                /* Only one possible expected token */
                *expected_ret = ps->p_grammar->
                    g_ll.ll_label[s->s_lower].lb_type;
            }
            else
                *expected_ret = -1;
        }
        return E_SYNTAX;
    }
}


#ifdef Py_DEBUG

/* DEBUG OUTPUT */

void
dumptree(grammar *g, node *n)
{
    int i;

    if (n == NULL)
        printf("NIL");
    else {
        label l;
        l.lb_type = TYPE(n);
        l.lb_str = STR(n);
        printf("%s", Ta27Grammar_LabelRepr(&l));
        if (ISNONTERMINAL(TYPE(n))) {
            printf("(");
            for (i = 0; i < NCH(n); i++) {
                if (i > 0)
                    printf(",");
                dumptree(g, CHILD(n, i));
            }
            printf(")");
        }
    }
}

void
showtree(grammar *g, node *n)
{
    int i;

    if (n == NULL)
        return;
    if (ISNONTERMINAL(TYPE(n))) {
        for (i = 0; i < NCH(n); i++)
            showtree(g, CHILD(n, i));
    }
    else if (ISTERMINAL(TYPE(n))) {
        printf("%s", _Ta27Parser_TokenNames[TYPE(n)]);
        if (TYPE(n) == NUMBER || TYPE(n) == NAME)
            printf("(%s)", STR(n));
        printf(" ");
    }
    else
        printf("? ");
}

#endif /* Py_DEBUG */

/*

Description
-----------

The parser's interface is different than usual: the function addtoken()
must be called for each token in the input.  This makes it possible to
turn it into an incremental parsing system later.  The parsing system
constructs a parse tree as it goes.

A parsing rule is represented as a Deterministic Finite-state Automaton
(DFA).  A node in a DFA represents a state of the parser; an arc represents
a transition.  Transitions are either labeled with terminal symbols or
with non-terminals.  When the parser decides to follow an arc labeled
with a non-terminal, it is invoked recursively with the DFA representing
the parsing rule for that as its initial state; when that DFA accepts,
the parser that invoked it continues.  The parse tree constructed by the
recursively called parser is inserted as a child in the current parse tree.

The DFAs can be constructed automatically from a more conventional
language description.  An extended LL(1) grammar (ELL(1)) is suitable.
Certain restrictions make the parser's life easier: rules that can produce
the empty string should be outlawed (there are other ways to put loops
or optional parts in the language).  To avoid the need to construct
FIRST sets, we can require that all but the last alternative of a rule
(really: arc going out of a DFA's state) must begin with a terminal
symbol.

As an example, consider this grammar:

expr:   term (OP term)*
term:   CONSTANT | '(' expr ')'

The DFA corresponding to the rule for expr is:

------->.---term-->.------->
        ^          |
        |          |
        \----OP----/

The parse tree generated for the input a+b is:

(expr: (term: (NAME: a)), (OP: +), (term: (NAME: b)))

*/
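The table-driven scheme the comment above describes can be sketched in Python. This is an illustrative sketch only, not typed_ast's C implementation: the `DFAS`/`ACCEPT`/`FIRST` tables and token names such as `LPAR`/`RPAR` are invented for this example, and the hardcoded FIRST sets stand in for the accelerator tables the real parser computes.

```python
# Grammar from the comment above:
#   expr: term (OP term)*
#   term: CONSTANT | '(' expr ')'
#
# Each rule is a tiny DFA: state -> list of (label, next_state).
# A label is either a terminal token type or a nonterminal rule name.
DFAS = {
    "expr": {0: [("term", 1)],
             1: [("OP", 2)],      # state 1 is also an accepting state
             2: [("term", 1)]},
    "term": {0: [("CONSTANT", 3), ("LPAR", 1)],
             1: [("expr", 2)],
             2: [("RPAR", 3)],
             3: []},
}
ACCEPT = {"expr": {1}, "term": {3}}
# FIRST sets let us choose a nonterminal arc with one token of lookahead,
# which is the ELL(1) restriction the comment mentions.
FIRST = {"expr": {"CONSTANT", "LPAR"}, "term": {"CONSTANT", "LPAR"}}

def parse(rule, tokens, pos):
    """Run one rule's DFA over tokens[pos:]; return (tree, new_pos)."""
    state, children = 0, []
    while True:
        tok = tokens[pos] if pos < len(tokens) else (None, None)
        for label, nxt in DFAS[rule][state]:
            if label in DFAS:                 # nonterminal arc: recurse,
                if tok[0] in FIRST[label]:    # insert subtree as a child
                    subtree, pos = parse(label, tokens, pos)
                    children.append(subtree)
                    state = nxt
                    break
            elif label == tok[0]:             # terminal arc: shift token
                children.append(tok)
                pos += 1
                state = nxt
                break
        else:                                 # no arc matched
            if state in ACCEPT[rule]:
                return (rule, children), pos  # this DFA accepts: "pop"
            raise SyntaxError("unexpected %r in %s" % (tok, rule))

tree, end = parse("expr", [("CONSTANT", "a"), ("OP", "+"), ("CONSTANT", "b")], 0)
# tree == ("expr", [("term", [("CONSTANT", "a")]), ("OP", "+"),
#                   ("term", [("CONSTANT", "b")])])
```

Note how state 1 of `expr` both accepts and carries an outgoing `OP` arc: that is how `(OP term)*` loops, mirroring the accept-only-state pops and shift/push steps that `Ta27Parser_AddToken` performs with its explicit stack instead of recursion.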


================================================
FILE: ast27/Parser/parser.h
================================================
#ifndef Ta27_PARSER_H
#define Ta27_PARSER_H
#ifdef __cplusplus
extern "C" {
#endif


/* Parser interface */

#define MAXSTACK 1500

typedef struct {
	int		 s_state;	/* State in current DFA */
	dfa		*s_dfa;		/* Current DFA */
	struct _node	*s_parent;	/* Where to add next node */
} stackentry;

typedef struct {
	stackentry	*s_top;		/* Top entry */
	stackentry	 s_base[MAXSTACK];/* Array of stack entries */
					/* NB The stack grows down */
} stack;

typedef struct {
	stack	 	p_stack;	/* Stack of parser states */
	grammar		*p_grammar;	/* Grammar to use */
	node		*p_tree;	/* Top of parse tree */
#ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD
	unsigned long	p_flags;	/* see co_flags in Include/code.h */
#endif
} parser_state;

parser_state *Ta27Parser_New(grammar *g, int start);
void Ta27Parser_Delete(parser_state *ps);
int Ta27Parser_AddToken(parser_state *ps, int type, char *str, int lineno, int col_offset,
                      int *expected_ret);
void Ta27Grammar_AddAccelerators(grammar *g);

#ifdef __cplusplus
}
#endif
#endif /* !Ta27_PARSER_H */


================================================
FILE: ast27/Parser/parsetok.c
================================================

/* Parser-tokenizer link implementation */

#include "../Include/pgenheaders.h"
#include "tokenizer.h"
#include "../Include/node.h"
#include "../Include/grammar.h"
#include "parser.h"
#include "../Include/parsetok.h"
#include "../Include/errcode.h"
#include "../Include/graminit.h"

int Ta27_TabcheckFlag;


/* Forward */
static node *parsetok(struct tok_state *, grammar *, int, perrdetail *, int *);
static void initerr(perrdetail *err_ret, const char* filename);
static int initerr_object(perrdetail *err_ret, PyObject *filename);

/* Parse input coming from a string.  Return error code, print some errors. */
node *
Ta27Parser_ParseString(const char *s, grammar *g, int start, perrdetail *err_ret)
{
    return Ta27Parser_ParseStringFlagsFilename(s, NULL, g, start, err_ret, 0);
}

node *
Ta27Parser_ParseStringFlags(const char *s, grammar *g, int start,
                          perrdetail *err_ret, int flags)
{
    return Ta27Parser_ParseStringFlagsFilename(s, NULL,
                                             g, start, err_ret, flags);
}

node *
Ta27Parser_ParseStringFlagsFilename(const char *s, const char *filename,
                          grammar *g, int start,
                          perrdetail *err_ret, int flags)
{
    int iflags = flags;
    return Ta27Parser_ParseStringFlagsFilenameEx(s, filename, g, start,
                                               err_ret, &iflags);
}

node *
Ta27Parser_ParseStringFlagsFilenameEx(const char *s, const char *filename,
                          grammar *g, int start,
                          perrdetail *err_ret, int *flags)
{
    struct tok_state *tok;

    initerr(err_ret, filename);

    if ((tok = Ta27Tokenizer_FromString(s, start == file_input)) == NULL) {
        err_ret->error = PyErr_Occurred() ? E_DECODE : E_NOMEM;
        return NULL;
    }

    tok->filename = filename ? filename : "<string>";
    if (Ta27_TabcheckFlag || Py_VerboseFlag) {
        tok->altwarning = (tok->filename != NULL);
        if (Ta27_TabcheckFlag >= 2)
            tok->alterror++;
    }

    return parsetok(tok, g, start, err_ret, flags);
}

node *
Ta27Parser_ParseStringObject(const char *s, PyObject *filename,
                           grammar *g, int start,
                           perrdetail *err_ret, int *flags)
{
    struct tok_state *tok;
    int exec_input = start == file_input;

    initerr_object(err_ret, filename);

    if (*flags & PyPARSE_IGNORE_COOKIE)
        tok = Ta27Tokenizer_FromUTF8(s, exec_input);
    else
        tok = Ta27Tokenizer_FromString(s, exec_input);

    if (tok == NULL) {
        err_ret->error = PyErr_Occurred() ? E_DECODE : E_NOMEM;
        return NULL;
    }

#ifndef PGEN
    Py_INCREF(err_ret->filename);
    tok->filename = PyUnicode_AsUTF8(err_ret->filename);
#endif
    return parsetok(tok, g, start, err_ret, flags);
}

/* Parse input coming from a file.  Return error code, print some errors. */

node *
Ta27Parser_ParseFile(FILE *fp, const char *filename, grammar *g, int start,
                   char *ps1, char *ps2, perrdetail *err_ret)
{
    return Ta27Parser_ParseFileFlags(fp, filename, g, start, ps1, ps2,
                                   err_ret, 0);
}

node *
Ta27Parser_ParseFileFlags(FILE *fp, const char *filename, grammar *g, int start,
                        char *ps1, char *ps2, perrdetail *err_ret, int flags)
{
    int iflags = flags;
    return Ta27Parser_ParseFileFlagsEx(fp, filename, g, start, ps1, ps2, err_ret, &iflags);
}

node *
Ta27Parser_ParseFileFlagsEx(FILE *fp, const char *filename, grammar *g, int start,
                          char *ps1, char *ps2, perrdetail *err_ret, int *flags)
{
    struct tok_state *tok;

    initerr(err_ret, filename);

    if ((tok = Ta27Tokenizer_FromFile(fp, ps1, ps2)) == NULL) {
        err_ret->error = E_NOMEM;
        return NULL;
    }
    tok->filename = filename;
    if (Ta27_TabcheckFlag || Py_VerboseFlag) {
        tok->altwarning = (filename != NULL);
        if (Ta27_TabcheckFlag >= 2)
            tok->alterror++;
    }

    return parsetok(tok, g, start, err_ret, flags);
}

#if 0
static char with_msg[] =
"%s:%d: Warning: 'with' will become a reserved keyword in Python 2.6\n";

static char as_msg[] =
"%s:%d: Warning: 'as' will become a reserved keyword in Python 2.6\n";

static void
warn(const char *msg, const char *filename, int lineno)
{
    if (filename == NULL)
        filename = "<string>";
    PySys_WriteStderr(msg, filename, lineno);
}
#endif


typedef struct {
    struct {
        int lineno;
        char *comment;
    } *items;
    size_t size;
    size_t num_items;
} growable_comment_array;

static int
growable_comment_array_init(growable_comment_array *arr, size_t initial_size) {
    assert(initial_size > 0);
    arr->items = malloc(initial_size * sizeof(*arr->items));
    arr->size = initial_size;
    arr->num_items = 0;

    return arr->items != NULL;
}

static int
growable_comment_array_add(growable_comment_array *arr, int lineno, char *comment) {
    if (arr->num_items >= arr->size) {
        arr->size *= 2;
        arr->items = realloc(arr->items, arr->size * sizeof(*arr->items));
        if (!arr->items) {
            return 0;
        }
    }

    arr->items[arr->num_items].lineno = lineno;
    arr->items[arr->num_items].comment = comment;
    arr->num_items++;
    return 1;
}

static void
growable_comment_array_deallocate(growable_comment_array *arr) {
    unsigned i;
    for (i = 0; i < arr->num_items; i++) {
        PyObject_FREE(arr->items[i].comment);
    }
    free(arr->items);
}


/* Parse input coming from the given tokenizer structure.
   Return error code. */

static node *
parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret,
         int *flags)
{
    parser_state *ps;
    node *n;
    int started = 0;

    growable_comment_array type_ignores;
    if (!growable_comment_array_init(&type_ignores, 10)) {
        err_ret->error = E_NOMEM;
        Ta27Tokenizer_Free(tok);
        return NULL;
    }

    if ((ps = Ta27Parser_New(g, start)) == NULL) {
        fprintf(stderr, "no mem for new parser\n");
        err_ret->error = E_NOMEM;
        Ta27Tokenizer_Free(tok);
        return NULL;
    }
#ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD
    if (*flags & PyPARSE_PRINT_IS_FUNCTION) {
        ps->p_flags |= CO_FUTURE_PRINT_FUNCTION;
    }
    if (*flags & PyPARSE_UNICODE_LITERALS) {
        ps->p_flags |= CO_FUTURE_UNICODE_LITERALS;
    }

#endif

    for (;;) {
        char *a, *b;
        int type;
        size_t len;
        char *str;
        int col_offset;

        type = Ta27Tokenizer_Get(tok, &a, &b);
        if (type == ERRORTOKEN) {
            err_ret->error = tok->done;
            break;
        }
        if (type == ENDMARKER && started) {
            type = NEWLINE; /* Add an extra newline */
            started = 0;
            /* Add the right number of dedent tokens,
               except if a certain flag is given --
               codeop.py uses this. */
            if (tok->indent &&
                !(*flags & PyPARSE_DONT_IMPLY_DEDENT))
            {
                tok->pendin = -tok->indent;
                tok->indent = 0;
            }
        }
        else
            started = 1;
        len = b - a; /* XXX this may compute NULL - NULL */
        str = (char *) PyObject_MALLOC(len + 1);
        if (str == NULL) {
            fprintf(stderr, "no mem for next token\n");
            err_ret->error = E_NOMEM;
            break;
        }
        if (len > 0)
            strncpy(str, a, len);
        str[len] = '\0';

#ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD
#endif
        if (a >= tok->line_start)
            col_offset = a - tok->line_start;
        else
            col_offset = -1;

        if (type == TYPE_IGNORE) {
            if (!growable_comment_array_add(&type_ignores, tok->lineno, str)) {
                err_ret->error = E_NOMEM;
                break;
            }
            continue;
        }

        if ((err_ret->error =
             Ta27Parser_AddToken(ps, (int)type, str, tok->lineno, col_offset,
                               &(err_ret->expected))) != E_OK) {
            if (err_ret->error != E_DONE) {
                PyObject_FREE(str);
                err_ret->token = type;
            }
            break;
        }
    }

    if (err_ret->error == E_DONE) {
        n = ps->p_tree;
        ps->p_tree = NULL;

        if (n->n_type == file_input) {
            /* Put type_ignore nodes in the ENDMARKER of file_input. */
            int num;
            node *ch;
            size_t i;

            num = NCH(n);
            ch = CHILD(n, num - 1);
            REQ(ch, ENDMARKER);

            for (i = 0; i < type_ignores.num_items; i++) {
                int res = Ta27Node_AddChild(ch, TYPE_IGNORE, type_ignores.items[i].comment,
                                            type_ignores.items[i].lineno, 0);
                if (res != 0) {
                    err_ret->error = res;
                    Ta27Node_Free(n);
                    n = NULL;
                    break;
                }
                type_ignores.items[i].comment = NULL;
            }
        }
    }
    else
        n = NULL;

    growable_comment_array_deallocate(&type_ignores);

#ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD
    *flags = ps->p_flags;
#endif
    Ta27Parser_Delete(ps);

    if (n == NULL) {
        if (tok->lineno <= 1 && tok->done == E_EOF)
            err_ret->error = E_EOF;
        err_ret->lineno = tok->lineno;
        if (tok->buf != NULL) {
            char *text = NULL;
            size_t len;
            assert(tok->cur - tok->buf < INT_MAX);
            err_ret->offset = (int)(tok->cur - tok->buf);
            len = tok->inp - tok->buf;
#ifdef Py_USING_UNICODE
            text = Ta27Tokenizer_RestoreEncoding(tok, len, &err_ret->offset);

#endif
            if (text == NULL) {
                text = (char *) PyObject_MALLOC(len + 1);
                if (text != NULL) {
                    if (len > 0)
                        strncpy(text, tok->buf, len);
                    text[len] = '\0';
                }
            }
            err_ret->text = text;
        }
    } else if (tok->encoding != NULL) {
        /* 'nodes->n_str' uses PyObject_*, while 'tok->encoding' was
         * allocated using PyMem_
         */
        node* r = Ta27Node_New(encoding_decl);
        if (r)
            r->n_str = PyObject_MALLOC(strlen(tok->encoding)+1);
        if (!r || !r->n_str) {
            err_ret->error = E_NOMEM;
            if (r)
                PyObject_FREE(r);
            n = NULL;
            goto done;
        }
        strcpy(r->n_str, tok->encoding);
        PyMem_FREE(tok->encoding);
        tok->encoding = NULL;
        r->n_nchildren = 1;
        r->n_child = n;
        n = r;
    }

done:
    Ta27Tokenizer_Free(tok);

    return n;
}

static void
initerr(perrdetail *err_ret, const char *filename)
{
  initerr_object(err_ret, PyUnicode_FromString(filename));
}

static int
initerr_object(perrdetail *err_ret, PyObject *filename)
{
    err_ret->error = E_OK;
    err_ret->lineno = 0;
    err_ret->offset = 0;
    err_ret->text = NULL;
    err_ret->token = -1;
    err_ret->expected = -1;
#ifndef PGEN
    if (filename) {
        Py_INCREF(filename);
        err_ret->filename = filename;
    }
    else {
        err_ret->filename = PyUnicode_FromString("<string>");
        if (err_ret->filename == NULL) {
            err_ret->error = E_ERROR;
            return -1;
        }
    }
#endif
    return 0;
}


================================================
FILE: ast27/Parser/spark.py
================================================
#  Copyright (c) 1998-2002 John Aycock
#
#  Permission is hereby granted, free of charge, to any person obtaining
#  a copy of this software and associated documentation files (the
#  "Software"), to deal in the Software without restriction, including
#  without limitation the rights to use, copy, modify, merge, publish,
#  distribute, sublicense, and/or sell copies of the Software, and to
#  permit persons to whom the Software is furnished to do so, subject to
#  the following conditions:
#
#  The above copyright notice and this permission notice shall be
#  included in all copies or substantial portions of the Software.
#
#  THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
#  EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
#  MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
#  IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
#  CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
#  TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
#  SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

__version__ = 'SPARK-0.7 (pre-alpha-5)'

import re
import string

def _namelist(instance):
    namelist, namedict, classlist = [], {}, [instance.__class__]
    for c in classlist:
        for b in c.__bases__:
            classlist.append(b)
        for name in c.__dict__.keys():
            if not namedict.has_key(name):
                namelist.append(name)
                namedict[name] = 1
    return namelist

class GenericScanner:
    def __init__(self, flags=0):
        pattern = self.reflect()
        self.re = re.compile(pattern, re.VERBOSE|flags)

        self.index2func = {}
        for name, number in self.re.groupindex.items():
            self.index2func[number-1] = getattr(self, 't_' + name)

    def makeRE(self, name):
        doc = getattr(self, name).__doc__
        rv = '(?P<%s>%s)' % (name[2:], doc)
        return rv

    def reflect(self):
        rv = []
        for name in _namelist(self):
            if name[:2] == 't_' and name != 't_default':
                rv.append(self.makeRE(name))

        rv.append(self.makeRE('t_default'))
        return string.join(rv, '|')

    def error(self, s, pos):
        print "Lexical error at position %s" % pos
        raise SystemExit

    def tokenize(self, s):
        pos = 0
        n = len(s)
        while pos < n:
            m = self.re.match(s, pos)
            if m is None:
                self.error(s, pos)

            groups = m.groups()
            for i in range(len(groups)):
                if groups[i] and self.index2func.has_key(i):
                    self.index2func[i](groups[i])
            pos = m.end()

    def t_default(self, s):
        r'( . | \n )+'
        print "Specification error: unmatched input"
        raise SystemExit

#
#  Extracted from GenericParser and made global so that [un]picking works.
#
class _State:
    def __init__(self, stateno, items):
        self.T, self.complete, self.items = [], [], items
        self.stateno = stateno

class GenericParser:
    #
    #  An Earley parser, as per J. Earley, "An Efficient Context-Free
    #  Parsing Algorithm", CACM 13(2), pp. 94-102.  Also J. C. Earley,
    #  "An Efficient Context-Free Parsing Algorithm", Ph.D. thesis,
    #  Carnegie-Mellon University, August 1968.  New formulation of
    #  the parser according to J. Aycock, "Practical Earley Parsing
    #  and the SPARK Toolkit", Ph.D. thesis, University of Victoria,
    #  2001, and J. Aycock and R. N. Horspool, "Practical Earley
    #  Parsing", unpublished paper, 2001.
    #

    def __init__(self, start):
        self.rules = {}
        self.rule2func = {}
        self.rule2name = {}
        self.collectRules()
        self.augment(start)
        self.ruleschanged = 1

    _NULLABLE = '\e_'
    _START = 'START'
    _BOF = '|-'

    #
    #  When pickling, take the time to generate the full state machine;
    #  some information is then extraneous, too.  Unfortunately we
    #  can't save the rule2func map.
    #
    def __getstate__(self):
        if self.ruleschanged:
            #
            #  XXX - duplicated from parse()
            #
            self.computeNull()
            self.newrules = {}
            self.new2old = {}
            self.makeNewRules()
            self.ruleschanged = 0
            self.edges, self.cores = {}, {}
            self.states = { 0: self.makeState0() }
            self.makeState(0, self._BOF)
        #
        #  XXX - should find a better way to do this..
        #
        changes = 1
        while changes:
            changes = 0
            for k, v in self.edges.items():
                if v is None:
                    state, sym = k
                    if self.states.has_key(state):
                        self.goto(state, sym)
                        changes = 1
        rv = self.__dict__.copy()
        for s in self.states.values():
            del s.items
        del rv['rule2func']
        del rv['nullable']
        del rv['cores']
        return rv

    def __setstate__(self, D):
        self.rules = {}
        self.rule2func = {}
        self.rule2name = {}
        self.collectRules()
        start = D['rules'][self._START][0][1][1]        # Blech.
        self.augment(start)
        D['rule2func'] = self.rule2func
        D['makeSet'] = self.makeSet_fast
        self.__dict__ = D

    #
    #  A hook for GenericASTBuilder and GenericASTMatcher.  Mess
    #  thee not with this; nor shall thee toucheth the _preprocess
    #  argument to addRule.
    #
    def preprocess(self, rule, func):       return rule, func

    def addRule(self, doc, func, _preprocess=1):
        fn = func
        rules = string.split(doc)

        index = []
        for i in range(len(rules)):
            if rules[i] == '::=':
                index.append(i-1)
        index.append(len(rules))

        for i in range(len(index)-1):
            lhs = rules[index[i]]
            rhs = rules[index[i]+2:index[i+1]]
            rule = (lhs, tuple(rhs))

            if _preprocess:
                rule, fn = self.preprocess(rule, func)

            if self.rules.has_key(lhs):
                self.rules[lhs].append(rule)
            else:
                self.rules[lhs] = [ rule ]
            self.rule2func[rule] = fn
            self.rule2name[rule] = func.__name__[2:]
        self.ruleschanged = 1

    def collectRules(self):
        for name in _namelist(self):
            if name[:2] == 'p_':
                func = getattr(self, name)
                doc = func.__doc__
                self.addRule(doc, func)

    def augment(self, start):
        rule = '%s ::= %s %s' % (self._START, self._BOF, start)
        self.addRule(rule, lambda args: args[1], 0)

    def computeNull(self):
        self.nullable = {}
        tbd = []

        for rulelist in self.rules.values():
            lhs = rulelist[0][0]
            self.nullable[lhs] = 0
            for rule in rulelist:
                rhs = rule[1]
                if len(rhs) == 0:
                    self.nullable[lhs] = 1
                    continue
                #
                #  We only need to consider rules which
                #  consist entirely of nonterminal symbols.
                #  This should be a savings on typical
                #  grammars.
                #
                for sym in rhs:
                    if not self.rules.has_key(sym):
                        break
                else:
                    tbd.append(rule)
        changes = 1
        while changes:
            changes = 0
            for lhs, rhs in tbd:
                if self.nullable[lhs]:
                    continue
                for sym in rhs:
                    if not self.nullable[sym]:
                        break
                else:
                    self.nullable[lhs] = 1
                    changes = 1

    def makeState0(self):
        s0 = _State(0, [])
        for rule in self.newrules[self._START]:
            s0.items.append((rule, 0))
        return s0

    def finalState(self, tokens):
        #
        #  Yuck.
        #
        if len(self.newrules[self._START]) == 2 and len(tokens) == 0:
            return 1
        start = self.rules[self._START][0][1][1]
        return self.goto(1, start)

    def makeNewRules(self):
        worklist = []
        for rulelist in self.rules.values():
            for rule in rulelist:
                worklist.append((rule, 0, 1, rule))

        for rule, i, candidate, oldrule in worklist:
            lhs, rhs = rule
            n = len(rhs)
            while i < n:
                sym = rhs[i]
                if not self.rules.has_key(sym) or \
                   not self.nullable[sym]:
                    candidate = 0
                    i = i + 1
                    continue

                newrhs = list(rhs)
                newrhs[i] = self._NULLABLE+sym
                newrule = (lhs, tuple(newrhs))
                worklist.append((newrule, i+1,
                                 candidate, oldrule))
                candidate = 0
                i = i + 1
            else:
                if candidate:
                    lhs = self._NULLABLE+lhs
                    rule = (lhs, rhs)
                if self.newrules.has_key(lhs):
                    self.newrules[lhs].append(rule)
                else:
                    self.newrules[lhs] = [ rule ]
                self.new2old[rule] = oldrule

    def typestring(self, token):
        return None

    def error(self, token):
        print "Syntax error at or near `%s' token" % token
        raise SystemExit

    def parse(self, tokens):
        sets = [ [(1,0), (2,0)] ]
        self.links = {}

        if self.ruleschanged:
            self.computeNull()
            self.newrules = {}
            self.new2old = {}
            self.makeNewRules()
            self.ruleschanged = 0
            self.edges, self.cores = {}, {}
            self.states = { 0: self.makeState0() }
            self.makeState(0, self._BOF)

        for i in xrange(len(tokens)):
            sets.append([])

            if sets[i] == []:
                break
            self.makeSet(tokens[i], sets, i)
        else:
            sets.append([])
            self.makeSet(None, sets, len(tokens))

        #_dump(tokens, sets, self.states)

        finalitem = (self.finalState(tokens), 0)
        if finalitem not in sets[-2]:
            if len(tokens) > 0:
                self.error(tokens[i-1])
            else:
                self.error(None)

        return self.buildTree(self._START, finalitem,
                              tokens, len(sets)-2)

    def isnullable(self, sym):
        #
        #  For symbols in G_e only.  If we weren't supporting 1.5,
        #  could just use sym.startswith().
        #
        return self._NULLABLE == sym[0:len(self._NULLABLE)]

    def skip(self, (lhs, rhs), pos=0):
        n = len(rhs)
        while pos < n:
            if not self.isnullable(rhs[pos]):
                break
            pos = pos + 1
        return pos

    def makeState(self, state, sym):
        assert sym is not None
        #
        #  Compute \epsilon-kernel state's core and see if
        #  it exists already.
        #
        kitems = []
        for rule, pos in self.states[state].items:
            lhs, rhs = rule
            if rhs[pos:pos+1] == (sym,):
                kitems.append((rule, self.skip(rule, pos+1)))
        core = kitems

        core.sort()
        tcore = tuple(core)
        if self.cores.has_key(tcore):
            return self.cores[tcore]
        #
        #  Nope, doesn't exist.  Compute it and the associated
        #  \epsilon-nonkernel state together; we'll need it right away.
        #
        k = self.cores[tcore] = len(self.states)
        K, NK = _State(k, kitems), _State(k+1, [])
        self.states[k] = K
        predicted = {}

        edges = self.edges
        rules = self.newrules
        for X in K, NK:
            worklist = X.items
            for item in worklist:
                rule, pos = item
                lhs, rhs = rule
                if pos == len(rhs):
                    X.complete.append(rule)
                    continue

                nextSym = rhs[pos]
                key = (X.stateno, nextSym)
                if not rules.has_key(nextSym):
                    if not edges.has_key(key):
                        edges[key] = None
                        X.T.append(nextSym)
                else:
                    edges[key] = None
                    if not predicted.has_key(nextSym):
                        predicted[nextSym] = 1
                        for prule in rules[nextSym]:
                            ppos = self.skip(prule)
                            new = (prule, ppos)
                            NK.items.append(new)
            #
            #  Problem: we know K needs generating, but we
            #  don't yet know about NK.  Can't commit anything
            #  regarding NK to self.edges until we're sure.  Should
            #  we delay committing on both K and NK to avoid this
            #  hacky code?  This creates other problems..
            #
            if X is K:
                edges = {}

        if NK.items == []:
            return k

        #
        #  Check for \epsilon-nonkernel's core.  Unfortunately we
        #  need to know the entire set of predicted nonterminals
        #  to do this without accidentally duplicating states.
        #
        core = predicted.keys()
        core.sort()
        tcore = tuple(core)
        if self.cores.has_key(tcore):
            self.edges[(k, None)] = self.cores[tcore]
            return k

        nk = self.cores[tcore] = self.edges[(k, None)] = NK.stateno
        self.edges.update(edges)
        self.states[nk] = NK
        return k

    def goto(self, state, sym):
        key = (state, sym)
        if not self.edges.has_key(key):
            #
            #  No transitions from state on sym.
            #
            return None

        rv = self.edges[key]
        if rv is None:
            #
            #  Target state isn't generated yet.  Remedy this.
            #
            rv = self.makeState(state, sym)
            self.edges[key] = rv
        return rv

    def gotoT(self, state, t):
        return [self.goto(state, t)]

    def gotoST(self, state, st):
        rv = []
        for t in self.states[state].T:
            if st == t:
                rv.append(self.goto(state, t))
        return rv

    def add(self, set, item, i=None, predecessor=None, causal=None):
        if predecessor is None:
            if item not in set:
                set.append(item)
        else:
            key = (item, i)
            if item not in set:
                self.links[key] = []
                set.append(item)
            self.links[key].append((predecessor, causal))

    def makeSet(self, token, sets, i):
        cur, next = sets[i], sets[i+1]

        ttype = token is not None and self.typestring(token) or None
        if ttype is not None:
            fn, arg = self.gotoT, ttype
        else:
            fn, arg = self.gotoST, token

        for item in cur:
            ptr = (item, i)
            state, parent = item
            add = fn(state, arg)
            for k in add:
                if k is not None:
                    self.add(next, (k, parent), i+1, ptr)
                    nk = self.goto(k, None)
                    if nk is not None:
                        self.add(next, (nk, i+1))

            if parent == i:
                continue

            for rule in self.states[state].complete:
                lhs, rhs = rule
                for pitem in sets[parent]:
                    pstate, pparent = pitem
                    k = self.goto(pstate, lhs)
                    if k is not None:
                        why = (item, i, rule)
                        pptr = (pitem, parent)
                        self.add(cur, (k, pparent),
                                 i, pptr, why)
                        nk = self.goto(k, None)
                        if nk is not None:
                            self.add(cur, (nk, i))

    def makeSet_fast(self, token, sets, i):
        #
        #  Call *only* when the entire state machine has been built!
        #  It relies on self.edges being filled in completely, and
        #  then duplicates and inlines code to boost speed at the
        #  cost of extreme ugliness.
        #
        cur, next = sets[i], sets[i+1]
        ttype = token is not None and self.typestring(token) or None

        for item in cur:
            ptr = (item, i)
            state, parent = item
            if ttype is not None:
                k = self.edges.get((state, ttype), None)
                if k is not None:
                    #self.add(next, (k, parent), i+1, ptr)
                    #INLINED --v
                    new = (k, parent)
                    key = (new, i+1)
                    if new not in next:
                        self.links[key] = []
                        next.append(new)
                    self.links[key].append((ptr, None))
                    #INLINED --^
                    #nk = self.goto(k, None)
                    nk = self.edges.get((k, None), None)
                    if nk is not None:
                        #self.add(next, (nk, i+1))
                        #INLINED --v
                        new = (nk, i+1)
                        if new not in next:
                            next.append(new)
                        #INLINED --^
            else:
                add = self.gotoST(state, token)
                for k in add:
                    if k is not None:
                        self.add(next, (k, parent), i+1, ptr)
                        #nk = self.goto(k, None)
                        nk = self.edges.get((k, None), None)
                        if nk is not None:
                            self.add(next, (nk, i+1))

            if parent == i:
                continue

            for rule in self.states[state].complete:
                lhs, rhs = rule
                for pitem in sets[parent]:
                    pstate, pparent = pitem
                    #k = self.goto(pstate, lhs)
                    k = self.edges.get((pstate, lhs), None)
                    if k is not None:
                        why = (item, i, rule)
                        pptr = (pitem, parent)
                        #self.add(cur, (k, pparent),
                        #        i, pptr, why)
                        #INLINED --v
                        new = (k, pparent)
                        key = (new, i)
                        if new not in cur:
                            self.links[key] = []
                            cur.append(new)
                        self.links[key].append((pptr, why))
                        #INLINED --^
                        #nk = self.goto(k, None)
                        nk = self.edges.get((k, None), None)
                        if nk is not None:
                            #self.add(cur, (nk, i))
                            #INLINED --v
                            new = (nk, i)
                            if new not in cur:
                                cur.append(new)
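`GenericParser.computeNull` above seeds nullable nonterminals from empty productions, keeps only the all-nonterminal rules for the fixed-point loop, and iterates until nothing changes. The same step can be re-expressed standalone; this is a hypothetical Python 3 sketch with its own names, not spark.py's API:

```python
def compute_nullable(rules):
    """Fixed-point computation of nullable nonterminals.

    rules: dict mapping lhs -> list of rhs tuples, e.g.
      {"S": [("A", "B"), ("x",)], "A": [()], "B": [("A",)]}
    Mirrors GenericParser.computeNull: a nonterminal is nullable if
    some production's symbols are all nullable (an empty rhs counts).
    """
    nullable = {lhs: False for lhs in rules}
    # Seed from directly-empty productions; keep only rules whose rhs
    # consists entirely of nonterminals for the loop, as the original
    # does ("a savings on typical grammars").
    tbd = []
    for lhs, rhss in rules.items():
        for rhs in rhss:
            if not rhs:
                nullable[lhs] = True
            elif all(sym in rules for sym in rhs):
                tbd.append((lhs, rhs))
    changed = True
    while changed:
        changed = False
        for lhs, rhs in tbd:
            if not nullable[lhs] and all(nullable[s] for s in rhs):
                nullable[lhs] = True
                changed = True
    return nullable
```

With the grammar in the docstring, `A` is nullable via its empty production, `B` via `A`, and `S` via `A B`.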
gitextract_xd1s015c/

├── .gitattributes
├── .github/
│   └── workflows/
│       └── build.yml
├── .gitignore
├── CONTRIBUTING.md
├── LICENSE
├── MANIFEST.in
├── README.md
├── ast27/
│   ├── Custom/
│   │   └── typed_ast.c
│   ├── Grammar/
│   │   └── Grammar
│   ├── Include/
│   │   ├── Python-ast.h
│   │   ├── asdl.h
│   │   ├── ast.h
│   │   ├── bitset.h
│   │   ├── compile.h
│   │   ├── errcode.h
│   │   ├── graminit.h
│   │   ├── grammar.h
│   │   ├── node.h
│   │   ├── parsetok.h
│   │   ├── pgenheaders.h
│   │   ├── pyarena.h
│   │   ├── pycore_pyarena.h
│   │   └── token.h
│   ├── Parser/
│   │   ├── Python.asdl
│   │   ├── acceler.c
│   │   ├── asdl.py
│   │   ├── asdl_c.py
│   │   ├── bitset.c
│   │   ├── grammar.c
│   │   ├── grammar1.c
│   │   ├── node.c
│   │   ├── parser.c
│   │   ├── parser.h
│   │   ├── parsetok.c
│   │   ├── spark.py
│   │   ├── tokenizer.c
│   │   └── tokenizer.h
│   └── Python/
│       ├── Python-ast.c
│       ├── asdl.c
│       ├── ast.c
│       ├── graminit.c
│       └── mystrtoul.c
├── ast3/
│   ├── Custom/
│   │   └── typed_ast.c
│   ├── Grammar/
│   │   └── Grammar
│   ├── Include/
│   │   ├── Python-ast.h
│   │   ├── asdl.h
│   │   ├── ast.h
│   │   ├── bitset.h
│   │   ├── errcode.h
│   │   ├── graminit.h
│   │   ├── grammar.h
│   │   ├── node.h
│   │   ├── parsetok.h
│   │   ├── pgenheaders.h
│   │   ├── pyarena.h
│   │   ├── pycore_pyarena.h
│   │   └── token.h
│   ├── Parser/
│   │   ├── Python.asdl
│   │   ├── acceler.c
│   │   ├── asdl.py
│   │   ├── asdl_c.py
│   │   ├── bitset.c
│   │   ├── grammar.c
│   │   ├── grammar1.c
│   │   ├── node.c
│   │   ├── parser.c
│   │   ├── parser.h
│   │   ├── parsetok.c
│   │   ├── tokenizer.c
│   │   └── tokenizer.h
│   ├── Python/
│   │   ├── Python-ast.c
│   │   ├── asdl.c
│   │   ├── ast.c
│   │   └── graminit.c
│   └── tests/
│       └── test_basics.py
├── release_process.md
├── setup.py
├── tools/
│   ├── Grammar.patch
│   ├── Python-asdl.patch
│   ├── asdl_c.patch
│   ├── ast.patch
│   ├── find_exported_symbols
│   ├── parsetok.patch
│   ├── script
│   ├── token.patch
│   ├── tokenizer.patch
│   ├── update_ast27_asdl
│   ├── update_ast3_asdl
│   ├── update_ast3_grammar
│   ├── update_exported_symbols
│   └── update_header_guards
└── typed_ast/
    ├── __init__.py
    ├── ast27.py
    ├── ast3.py
    └── conversions.py
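The `typed_ast/` package above (`ast27.py`, `ast3.py`, `conversions.py`) is the Python-facing API for parsing source with PEP 484 type comments. That support was merged into CPython's stdlib `ast` module in Python 3.8, so the same information can be sketched with the stdlib alone (an illustration using the stdlib equivalent, not typed_ast's own code):

```python
import ast

# Parse source containing a PEP 484 type comment.  typed_ast.ast3.parse
# exposed this on its nodes; since Python 3.8 the stdlib does too when
# type_comments=True is passed.
source = "x = []  # type: list\n"
tree = ast.parse(source, type_comments=True)
assign = tree.body[0]
print(assign.type_comment)  # -> "list" (the raw text after "# type:")
```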
SYMBOL INDEX (1243 symbols across 53 files)

FILE: ast27/Custom/typed_ast.c
  function PARSER_FLAGS (line 66) | static int PARSER_FLAGS(PyCompilerFlags *flags)
  function err_input (line 80) | static void
  function err_free (line 200) | static void
  function mod_ty (line 208) | static mod_ty
  function PyObject (line 238) | static PyObject *
  function PyObject (line 260) | static PyObject *
  function PyObject (line 303) | PyObject *

FILE: ast27/Include/Python-ast.h
  type _mod (line 5) | struct _mod
  type _stmt (line 7) | struct _stmt
  type _expr (line 9) | struct _expr
  type expr_context_ty (line 11) | typedef enum _expr_context { Load=1, Store=2, Del=3, AugLoad=4, AugStore...
  type _slice (line 14) | struct _slice
  type boolop_ty (line 16) | typedef enum _boolop { And=1, Or=2 } boolop_ty;
  type operator_ty (line 18) | typedef enum _operator { Add=1, Sub=2, Mult=3, Div=4, Mod=5, Pow=6, LShi...
  type unaryop_ty (line 21) | typedef enum _unaryop { Invert=1, Not=2, UAdd=3, USub=4 } unaryop_ty;
  type cmpop_ty (line 23) | typedef enum _cmpop { Eq=1, NotEq=2, Lt=3, LtE=4, Gt=5, GtE=6, Is=7, IsN...
  type _comprehension (line 26) | struct _comprehension
  type _excepthandler (line 28) | struct _excepthandler
  type _arguments (line 30) | struct _arguments
  type _keyword (line 32) | struct _keyword
  type _alias (line 34) | struct _alias
  type _type_ignore (line 36) | struct _type_ignore
  type _mod_kind (line 39) | enum _mod_kind {Module_kind=1, Interactive_kind=2, Expression_kind=3, Fu...
  type _mod (line 41) | struct _mod {
  type _stmt_kind (line 69) | enum _stmt_kind {FunctionDef_kind=1, ClassDef_kind=2, Return_kind=3, Del...
  type _stmt (line 74) | struct _stmt {
  type _expr_kind (line 196) | enum _expr_kind {BoolOp_kind=1, BinOp_kind=2, UnaryOp_kind=3, Lambda_kin...
  type _expr (line 201) | struct _expr {
  type _slice_kind (line 324) | enum _slice_kind {Ellipsis_kind=1, Slice_kind=2, ExtSlice_kind=3, Index_...
  type _slice (line 325) | struct _slice {
  type _comprehension (line 345) | struct _comprehension {
  type _excepthandler_kind (line 351) | enum _excepthandler_kind {ExceptHandler_kind=1}
  type _excepthandler (line 352) | struct _excepthandler {
  type _arguments (line 366) | struct _arguments {
  type _keyword (line 374) | struct _keyword {
  type _alias (line 379) | struct _alias {
  type _type_ignore_kind (line 384) | enum _type_ignore_kind {TypeIgnore_kind=1}
  type _type_ignore (line 385) | struct _type_ignore {

FILE: ast27/Include/asdl.h
  type PyObject (line 6) | typedef PyObject * identifier;
  type PyObject (line 7) | typedef PyObject * string;
  type PyObject (line 8) | typedef PyObject * object;
  type asdl_seq (line 24) | typedef struct {
  type asdl_int_seq (line 29) | typedef struct {

FILE: ast27/Include/bitset.h
  type BYTE (line 12) | typedef BYTE *bitset;

FILE: ast27/Include/compile.h
  type _mod (line 12) | struct _mod

FILE: ast27/Include/grammar.h
  type label (line 14) | typedef struct {
  type labellist (line 23) | typedef struct {
  type arc (line 30) | typedef struct {
  type state (line 37) | typedef struct {
  type dfa (line 50) | typedef struct {
  type grammar (line 61) | typedef struct {

FILE: ast27/Include/node.h
  type node (line 10) | typedef struct _node {

FILE: ast27/Include/parsetok.h
  type perrdetail (line 10) | typedef struct {

FILE: ast27/Include/pycore_pyarena.h
  type PyArena (line 10) | typedef struct _arena PyArena;

FILE: ast27/Parser/acceler.c
  function Ta27Grammar_AddAccelerators (line 23) | void
  function Ta27Grammar_RemoveAccelerators (line 34) | void
  function fixdfa (line 53) | static void
  function fixstate (line 63) | static void

FILE: ast27/Parser/asdl.py
  class Token (line 18) | class Token(object):
    method __init__ (line 21) | def __init__(self, type, lineno):
    method __str__ (line 25) | def __str__(self):
    method __repr__ (line 28) | def __repr__(self):
  class Id (line 31) | class Id(Token):
    method __init__ (line 32) | def __init__(self, value, lineno):
    method __str__ (line 37) | def __str__(self):
  class String (line 40) | class String(Token):
    method __init__ (line 41) | def __init__(self, value, lineno):
  class ASDLSyntaxError (line 46) | class ASDLSyntaxError(Exception):
    method __init__ (line 48) | def __init__(self, lineno, token=None, msg=None):
    method __str__ (line 53) | def __str__(self):
  class ASDLScanner (line 59) | class ASDLScanner(spark.GenericScanner, object):
    method tokenize (line 61) | def tokenize(self, input):
    method t_id (line 67) | def t_id(self, s):
    method t_string (line 73) | def t_string(self, s):
    method t_xxx (line 77) | def t_xxx(self, s): # not sure what this production means
    method t_punctuation (line 81) | def t_punctuation(self, s):
    method t_comment (line 85) | def t_comment(self, s):
    method t_newline (line 89) | def t_newline(self, s):
    method t_whitespace (line 93) | def t_whitespace(self, s):
    method t_default (line 97) | def t_default(self, s):
  class ASDLParser (line 101) | class ASDLParser(spark.GenericParser, object):
    method __init__ (line 102) | def __init__(self):
    method typestring (line 105) | def typestring(self, tok):
    method error (line 108) | def error(self, tok):
    method p_module_0 (line 111) | def p_module_0(self, (module, name, version, _0, _1)):
    method p_module (line 118) | def p_module(self, (module, name, version, _0, definitions, _1)):
    method p_version (line 125) | def p_version(self, (version, V)):
    method p_definition_0 (line 132) | def p_definition_0(self, (definition,)):
    method p_definition_1 (line 136) | def p_definition_1(self, (definitions, definition)):
    method p_definition (line 140) | def p_definition(self, (id, _, type)):
    method p_type_0 (line 144) | def p_type_0(self, (product,)):
    method p_type_1 (line 148) | def p_type_1(self, (sum,)):
    method p_type_2 (line 152) | def p_type_2(self, (sum, id, _0, attributes, _1)):
    method p_product (line 161) | def p_product(self, (_0, fields, _1)):
    method p_sum_0 (line 167) | def p_sum_0(self, (constructor,)):
    method p_sum_1 (line 171) | def p_sum_1(self, (constructor, _, sum)):
    method p_sum_2 (line 175) | def p_sum_2(self, (constructor, _, sum)):
    method p_constructor_0 (line 179) | def p_constructor_0(self, (id,)):
    method p_constructor_1 (line 183) | def p_constructor_1(self, (id, _0, fields, _1)):
    method p_fields_0 (line 189) | def p_fields_0(self, (field,)):
    method p_fields_1 (line 193) | def p_fields_1(self, (field, _, fields)):
    method p_field_0 (line 197) | def p_field_0(self, (type,)):
    method p_field_1 (line 201) | def p_field_1(self, (type, name)):
    method p_field_2 (line 205) | def p_field_2(self, (type, _, name)):
    method p_field_3 (line 209) | def p_field_3(self, (type, _, name)):
    method p_field_4 (line 213) | def p_field_4(self, (type, _)):
    method p_field_5 (line 217) | def p_field_5(self, (type, _)):
  class AST (line 227) | class AST(object):
  class Module (line 230) | class Module(AST):
    method __init__ (line 231) | def __init__(self, name, dfns, version):
    method __repr__ (line 239) | def __repr__(self):
  class Type (line 242) | class Type(AST):
    method __init__ (line 243) | def __init__(self, name, value):
    method __repr__ (line 247) | def __repr__(self):
  class Constructor (line 250) | class Constructor(AST):
    method __init__ (line 251) | def __init__(self, name, fields=None):
    method __repr__ (line 255) | def __repr__(self):
  class Field (line 258) | class Field(AST):
    method __init__ (line 259) | def __init__(self, type, name=None, seq=False, opt=False):
    method __repr__ (line 265) | def __repr__(self):
  class Sum (line 277) | class Sum(AST):
    method __init__ (line 278) | def __init__(self, types, attributes=None):
    method __repr__ (line 282) | def __repr__(self):
  class Product (line 288) | class Product(AST):
    method __init__ (line 289) | def __init__(self, fields):
    method __repr__ (line 292) | def __repr__(self):
  class VisitorBase (line 295) | class VisitorBase(object):
    method __init__ (line 297) | def __init__(self, skip=False):
    method visit (line 301) | def visit(self, object, *args):
    method _dispatch (line 316) | def _dispatch(self, object):
  class Check (line 329) | class Check(VisitorBase):
    method __init__ (line 331) | def __init__(self):
    method visitModule (line 337) | def visitModule(self, mod):
    method visitType (line 341) | def visitType(self, type):
    method visitSum (line 344) | def visitSum(self, sum, name):
    method visitConstructor (line 348) | def visitConstructor(self, cons, name):
    method visitField (line 360) | def visitField(self, field, name):
    method visitProduct (line 365) | def visitProduct(self, prod, name):
  function check (line 369) | def check(mod):
  function parse (line 381) | def parse(file):
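The `VisitorBase.visit` method listed above dispatches on the node's class name (looking up a `visit<ClassName>` method). A minimal Python 3 sketch of that dispatch pattern, with hypothetical node and visitor names:

```python
class VisitorBase:
    """Sketch of asdl.py's VisitorBase dispatch: visit() looks up a
    method named after the node's class and calls it if present."""

    def visit(self, obj, *args):
        # "visit" + the node's class name, e.g. visitNum for Num nodes.
        meth = getattr(self, "visit" + obj.__class__.__name__, None)
        if meth is not None:
            return meth(obj, *args)

class Num:
    def __init__(self, n):
        self.n = n

class Printer(VisitorBase):
    def visitNum(self, node):
        return "Num(%d)" % node.n
```

Nodes without a matching `visit*` method are silently skipped, which is how the original's `skip=True` mode behaves.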

FILE: ast27/Parser/asdl_c.py
  function get_c_type (line 14) | def get_c_type(name):
  function reflow_lines (line 28) | def reflow_lines(s, depth):
  function is_simple (line 70) | def is_simple(sum):
  class EmitVisitor (line 82) | class EmitVisitor(asdl.VisitorBase):
    method __init__ (line 85) | def __init__(self, file):
    method emit (line 89) | def emit(self, s, depth, reflow=True):
  class TypeDefVisitor (line 100) | class TypeDefVisitor(EmitVisitor):
    method visitModule (line 101) | def visitModule(self, mod):
    method visitType (line 105) | def visitType(self, type, depth=0):
    method visitSum (line 108) | def visitSum(self, sum, name, depth):
    method simple_sum (line 114) | def simple_sum(self, sum, name, depth):
    method sum_with_constructors (line 125) | def sum_with_constructors(self, sum, name, depth):
    method visitProduct (line 131) | def visitProduct(self, product, name, depth):
  class StructVisitor (line 138) | class StructVisitor(EmitVisitor):
    method visitModule (line 141) | def visitModule(self, mod):
    method visitType (line 145) | def visitType(self, type, depth=0):
    method visitSum (line 148) | def visitSum(self, sum, name, depth):
    method sum_with_constructors (line 152) | def sum_with_constructors(self, sum, name, depth):
    method visitConstructor (line 176) | def visitConstructor(self, cons, depth):
    method visitField (line 187) | def visitField(self, field, depth):
    method visitProduct (line 200) | def visitProduct(self, product, name, depth):
  class PrototypeVisitor (line 208) | class PrototypeVisitor(EmitVisitor):
    method visitModule (line 211) | def visitModule(self, mod):
    method visitType (line 215) | def visitType(self, type):
    method visitSum (line 218) | def visitSum(self, sum, name):
    method get_args (line 225) | def get_args(self, fields):
    method visitConstructor (line 252) | def visitConstructor(self, cons, type, attrs):
    method emit_function (line 258) | def emit_function(self, name, ctype, args, attrs, union=True):
    method visitProduct (line 273) | def visitProduct(self, prod, name):
  class FunctionVisitor (line 278) | class FunctionVisitor(PrototypeVisitor):
    method emit_function (line 281) | def emit_function(self, name, ctype, args, attrs, union=True):
    method emit_body_union (line 316) | def emit_body_union(self, name, args, attrs):
    method emit_body_struct (line 325) | def emit_body_struct(self, name, args, attrs):
  class PickleVisitor (line 333) | class PickleVisitor(EmitVisitor):
    method visitModule (line 335) | def visitModule(self, mod):
    method visitType (line 339) | def visitType(self, type):
    method visitSum (line 342) | def visitSum(self, sum, name):
    method visitProduct (line 345) | def visitProduct(self, sum, name):
    method visitConstructor (line 348) | def visitConstructor(self, cons, name):
    method visitField (line 351) | def visitField(self, sum):
  class Obj2ModPrototypeVisitor (line 355) | class Obj2ModPrototypeVisitor(PickleVisitor):
    method visitProduct (line 356) | def visitProduct(self, prod, name):
  class Obj2ModVisitor (line 363) | class Obj2ModVisitor(PickleVisitor):
    method funcHeader (line 364) | def funcHeader(self, name):
    method sumTrailer (line 373) | def sumTrailer(self, name):
    method simpleSum (line 387) | def simpleSum(self, sum, name):
    method buildArgs (line 402) | def buildArgs(self, fields):
    method complexSum (line 405) | def complexSum(self, sum, name):
    method visitAttributeDeclaration (line 436) | def visitAttributeDeclaration(self, a, name, sum=sum):
    method visitSum (line 440) | def visitSum(self, sum, name):
    method visitProduct (line 446) | def visitProduct(self, prod, name):
    method visitFieldDeclaration (line 466) | def visitFieldDeclaration(self, field, name, sum=None, prod=None, dept...
    method isSimpleSum (line 477) | def isSimpleSum(self, field):
    method isNumeric (line 482) | def isNumeric(self, field):
    method isSimpleType (line 485) | def isSimpleType(self, field):
    method visitField (line 488) | def visitField(self, field, name, sum=None, prod=None, depth=0):
  class MarshalPrototypeVisitor (line 541) | class MarshalPrototypeVisitor(PickleVisitor):
    method prototype (line 543) | def prototype(self, sum, name):
  class PyTypesDeclareVisitor (line 551) | class PyTypesDeclareVisitor(PickleVisitor):
    method visitProduct (line 553) | def visitProduct(self, prod, name):
    method visitSum (line 562) | def visitSum(self, sum, name):
    method visitConstructor (line 581) | def visitConstructor(self, cons, name):
  class PyTypesVisitor (line 589) | class PyTypesVisitor(PickleVisitor):
    method visitModule (line 591) | def visitModule(self, mod):
    method visitProduct (line 887) | def visitProduct(self, prod, name):
    method visitSum (line 896) | def visitSum(self, sum, name):
    method visitConstructor (line 909) | def visitConstructor(self, cons, name, simple):
  class ASTModuleVisitor (line 923) | class ASTModuleVisitor(PickleVisitor):
    method visitModule (line 925) | def visitModule(self, mod):
    method visitProduct (line 951) | def visitProduct(self, prod, name):
    method visitSum (line 954) | def visitSum(self, sum, name):
    method visitConstructor (line 959) | def visitConstructor(self, cons, name):
    method addObj (line 962) | def addObj(self, name):
  function find_sequence (line 968) | def find_sequence(fields, doing_specialization):
  function has_sequence (line 978) | def has_sequence(types, doing_specialization):
  class StaticVisitor (line 985) | class StaticVisitor(PickleVisitor):
    method visit (line 988) | def visit(self, object):
  class ObjVisitor (line 992) | class ObjVisitor(PickleVisitor):
    method func_begin (line 994) | def func_begin(self, name):
    method func_end (line 1007) | def func_end(self):
    method visitSum (line 1016) | def visitSum(self, sum, name):
    method simpleSum (line 1034) | def simpleSum(self, sum, name):
    method visitProduct (line 1050) | def visitProduct(self, prod, name):
    method visitConstructor (line 1058) | def visitConstructor(self, cons, enum, name):
    method visitField (line 1066) | def visitField(self, field, name, depth, product):
    method emitSeq (line 1079) | def emitSeq(self, field, value, depth, emit):
    method set (line 1091) | def set(self, field, value, depth):
  class PartingShots (line 1113) | class PartingShots(StaticVisitor):
  class ChainOfVisitors (line 1163) | class ChainOfVisitors:
    method __init__ (line 1164) | def __init__(self, *visitors):
    method visit (line 1167) | def visit(self, object):
  function main (line 1185) | def main(srcfile):

FILE: ast27/Parser/bitset.c
  function bitset (line 7) | bitset
  function delbitset (line 22) | void
  function addbit (line 28) | int
  function testbit (line 41) | int
  function samebitset (line 48) | int
  function mergebitset (line 59) | void

FILE: ast27/Parser/grammar.c
  function grammar (line 18) | grammar *
  function dfa (line 35) | dfa *
  function addstate (line 54) | int
  function addarc (line 73) | void
  function addlabel (line 91) | int
  function findlabel (line 117) | int
  function translatelabels (line 135) | void
  function translabel (line 148) | static void

FILE: ast27/Parser/grammar1.c
  function dfa (line 11) | dfa *

FILE: ast27/Parser/node.c
  function node (line 7) | node *
  function fancy_roundup (line 22) | static int
  function Ta27Node_AddChild (line 78) | int
  function Ta27Node_Free (line 120) | void
  function Py_ssize_t (line 129) | Py_ssize_t
  function freechildren (line 139) | static void
  function Py_ssize_t (line 151) | static Py_ssize_t

FILE: ast27/Parser/parser.c
  function s_reset (line 29) | static void
  function s_push (line 37) | static int
  function s_pop (line 54) | static void
  function parser_state (line 71) | parser_state *
  function Ta27Parser_Delete (line 95) | void
  function shift (line 107) | static int
  function push (line 119) | static int
  function classify (line 136) | static int
  function future_hack (line 178) | static void
  function Ta27Parser_AddToken (line 219) | int
  function dumptree (line 329) | void
  function showtree (line 353) | void

FILE: ast27/Parser/parser.h
  type stackentry (line 12) | typedef struct {
  type stack (line 18) | typedef struct {
  type parser_state (line 24) | typedef struct {

FILE: ast27/Parser/parsetok.c
  type tok_state (line 17) | struct tok_state
  function node (line 22) | node *
  function node (line 28) | node *
  function node (line 36) | node *
  function node (line 46) | node *
  function node (line 70) | node *
  function node (line 99) | node *
  function node (line 107) | node *
  function node (line 115) | node *
  function warn (line 144) | static void
  type growable_comment_array (line 154) | typedef struct {
  function growable_comment_array_init (line 163) | static int
  function growable_comment_array_add (line 173) | static int
  function growable_comment_array_deallocate (line 189) | static void
  function node (line 202) | static node *
  function initerr (line 386) | static void
  function initerr_object (line 392) | static int

FILE: ast27/Parser/spark.py
  function _namelist (line 27) | def _namelist(instance):
  class GenericScanner (line 38) | class GenericScanner:
    method __init__ (line 39) | def __init__(self, flags=0):
    method makeRE (line 47) | def makeRE(self, name):
    method reflect (line 52) | def reflect(self):
    method error (line 61) | def error(self, s, pos):
    method tokenize (line 65) | def tokenize(self, s):
    method t_default (line 79) | def t_default(self, s):
  class _State (line 87) | class _State:
    method __init__ (line 88) | def __init__(self, stateno, items):
  class GenericParser (line 92) | class GenericParser:
    method __init__ (line 104) | def __init__(self, start):
    method __getstate__ (line 121) | def __getstate__(self):
    method __setstate__ (line 154) | def __setstate__(self, D):
    method preprocess (line 170) | def preprocess(self, rule, func):       return rule, func
    method addRule (line 172) | def addRule(self, doc, func, _preprocess=1):
    method collectRules (line 198) | def collectRules(self):
    method augment (line 205) | def augment(self, start):
    method computeNull (line 209) | def computeNull(self):
    method makeState0 (line 245) | def makeState0(self):
    method finalState (line 251) | def finalState(self, tokens):
    method makeNewRules (line 260) | def makeNewRules(self):
    method typestring (line 294) | def typestring(self, token):
    method error (line 297) | def error(self, token):
    method parse (line 301) | def parse(self, tokens):
    method isnullable (line 337) | def isnullable(self, sym):
    method skip (line 344) | def skip(self, (lhs, rhs), pos=0):
    method makeState (line 352) | def makeState(self, state, sym):
    method goto (line 433) | def goto(self, state, sym):
    method gotoT (line 450) | def gotoT(self, state, t):
    method gotoST (line 453) | def gotoST(self, state, st):
    method add (line 460) | def add(self, set, item, i=None, predecessor=None, causal=None):
    method makeSet (line 471) | def makeSet(self, token, sets, i):
    method makeSet_fast (line 508) | def makeSet_fast(self, token, sets, i):
    method predecessor (line 584) | def predecessor(self, key, causal):
    method causal (line 590) | def causal(self, key):
    method deriveEpsilon (line 602) | def deriveEpsilon(self, nt):
    method buildTree (line 616) | def buildTree(self, nt, item, tokens, k):
    method ambiguity (line 649) | def ambiguity(self, rules):
    method resolve (line 666) | def resolve(self, list):
  class GenericASTBuilder (line 682) | class GenericASTBuilder(GenericParser):
    method __init__ (line 683) | def __init__(self, AST, start):
    method preprocess (line 687) | def preprocess(self, rule, func):
    method buildASTNode (line 694) | def buildASTNode(self, args, lhs):
    method terminal (line 703) | def terminal(self, token):      return token
    method nonterminal (line 705) | def nonterminal(self, type, args):
  class GenericASTTraversalPruningException (line 720) | class GenericASTTraversalPruningException:
  class GenericASTTraversal (line 723) | class GenericASTTraversal:
    method __init__ (line 724) | def __init__(self, ast):
    method typestring (line 727) | def typestring(self, node):
    method prune (line 730) | def prune(self):
    method preorder (line 733) | def preorder(self, node=None):
    method postorder (line 755) | def postorder(self, node=None):
    method default (line 770) | def default(self, node):
  class GenericASTMatcher (line 780) | class GenericASTMatcher(GenericParser):
    method __init__ (line 781) | def __init__(self, start, ast):
    method preprocess (line 785) | def preprocess(self, rule, func):
    method foundMatch (line 795) | def foundMatch(self, args, func):
    method match_r (line 799) | def match_r(self, node):
    method match (line 812) | def match(self, ast=None):
    method resolve (line 820) | def resolve(self, list):
  function _dump (line 826) | def _dump(tokens, sets, states):

FILE: ast27/Parser/tokenizer.c
  type tok_state (line 34) | struct tok_state
  type tok_state (line 35) | struct tok_state
  type tok_state (line 36) | struct tok_state
  type tok_state (line 107) | struct tok_state
  type tok_state (line 110) | struct tok_state
  type tok_state (line 110) | struct tok_state
  type tok_state (line 111) | struct tok_state
  type tok_state (line 157) | struct tok_state
  function decoding_feof (line 162) | static int
  type tok_state (line 169) | struct tok_state
  type tok_state (line 177) | struct tok_state
  function check_coding_spec (line 269) | static int
  function check_bom (line 336) | static int
  type tok_state (line 411) | struct tok_state
  function fp_setreadl (line 479) | static int
  function fp_getc (line 528) | static int fp_getc(struct tok_state *tok) {
  function fp_ungetc (line 534) | static void fp_ungetc(int c, struct tok_state *tok) {
  type tok_state (line 542) | struct tok_state
  function decoding_feof (line 599) | static int
  function buf_getc (line 621) | static int
  function buf_ungetc (line 628) | static void
  function buf_setreadl (line 637) | static int
  function PyObject (line 647) | static PyObject *
  type tok_state (line 661) | struct tok_state
  type tok_state (line 705) | struct tok_state
  type tok_state (line 768) | struct tok_state
  type tok_state (line 771) | struct tok_state
  type tok_state (line 786) | struct tok_state
  type tok_state (line 789) | struct tok_state
  type tok_state (line 812) | struct tok_state
  type tok_state (line 815) | struct tok_state
  function Ta27Tokenizer_Free (line 833) | void
  function tok_nextc (line 851) | static int
  function tok_backup (line 1045) | static void
  function Ta27Token_OneChar (line 1059) | int
  function Ta27Token_TwoChars (line 1092) | int
  function Ta27Token_ThreeChars (line 1166) | int
  function indenterror (line 1214) | static int
  function tok_get (line 1232) | static int
  function Ta27Tokenizer_Get (line 1749) | int
  type tok_state (line 1766) | struct tok_state
  function PyObject (line 1772) | static PyObject *
  type tok_state (line 1786) | struct tok_state
  function tok_dump (line 1824) | void

FILE: ast27/Parser/tokenizer.h
  type tok_state (line 16) | struct tok_state {
  type tok_state (line 58) | struct tok_state
  type tok_state (line 59) | struct tok_state
  type tok_state (line 60) | struct tok_state
  type tok_state (line 61) | struct tok_state
  type tok_state (line 62) | struct tok_state
  type tok_state (line 64) | struct tok_state

FILE: ast27/Python/Python-ast.c
  function ast_type_init (line 403) | static int
  function PyObject (line 455) | static PyObject *
  function PyTypeObject (line 522) | static PyTypeObject* make_type(char *type, PyTypeObject* base, char**fie...
  function add_attributes (line 542) | static int add_attributes(PyTypeObject* type, char**attrs, int num_fields)
  function PyObject (line 563) | static PyObject* ast2obj_list(asdl_seq *seq, PyObject* (*func)(void*))
  function PyObject (line 581) | static PyObject* ast2obj_object(void *o)
  function PyObject (line 590) | static PyObject* ast2obj_bool(bool b)
  function PyObject (line 595) | static PyObject* ast2obj_int(long b)
  function obj2ast_object (line 602) | static int obj2ast_object(PyObject* obj, PyObject** out, PyArena* arena)
  function obj2ast_identifier (line 613) | static int obj2ast_identifier(PyObject* obj, PyObject** out, PyArena* ar...
  function obj2ast_string (line 623) | static int obj2ast_string(PyObject* obj, PyObject** out, PyArena* arena)
  function obj2ast_int (line 633) | static int obj2ast_int(PyObject* obj, int* out, PyArena* arena)
  function obj2ast_bool (line 652) | static int obj2ast_bool(PyObject* obj, bool* out, PyArena* arena)
  function add_ast_fields (line 667) | static int add_ast_fields(void)
  function init_types (line 685) | static int init_types(void)
  function mod_ty (line 1000) | mod_ty
  function mod_ty (line 1013) | mod_ty
  function mod_ty (line 1025) | mod_ty
  function mod_ty (line 1042) | mod_ty
  function mod_ty (line 1060) | mod_ty
  function stmt_ty (line 1072) | stmt_ty
  function stmt_ty (line 1101) | stmt_ty
  function stmt_ty (line 1124) | stmt_ty
  function stmt_ty (line 1138) | stmt_ty
  function stmt_ty (line 1152) | stmt_ty
  function stmt_ty (line 1174) | stmt_ty
  function stmt_ty (line 1205) | stmt_ty
  function stmt_ty (line 1221) | stmt_ty
  function stmt_ty (line 1250) | stmt_ty
  function stmt_ty (line 1271) | stmt_ty
  function stmt_ty (line 1292) | stmt_ty
  function stmt_ty (line 1315) | stmt_ty
  function stmt_ty (line 1331) | stmt_ty
  function stmt_ty (line 1348) | stmt_ty
  function stmt_ty (line 1363) | stmt_ty
  function stmt_ty (line 1383) | stmt_ty
  function stmt_ty (line 1397) | stmt_ty
  function stmt_ty (line 1414) | stmt_ty
  function stmt_ty (line 1435) | stmt_ty
  function stmt_ty (line 1449) | stmt_ty
  function stmt_ty (line 1468) | stmt_ty
  function stmt_ty (line 1481) | stmt_ty
  function stmt_ty (line 1494) | stmt_ty
  function expr_ty (line 1507) | expr_ty
  function expr_ty (line 1527) | expr_ty
  function expr_ty (line 1558) | expr_ty
  function expr_ty (line 1583) | expr_ty
  function expr_ty (line 1608) | expr_ty
  function expr_ty (line 1639) | expr_ty
  function expr_ty (line 1654) | expr_ty
  function expr_ty (line 1668) | expr_ty
  function expr_ty (line 1688) | expr_ty
  function expr_ty (line 1708) | expr_ty
  function expr_ty (line 1735) | expr_ty
  function expr_ty (line 1755) | expr_ty
  function expr_ty (line 1769) | expr_ty
  function expr_ty (line 1791) | expr_ty
  function expr_ty (line 1815) | expr_ty
  function expr_ty (line 1834) | expr_ty
  function expr_ty (line 1853) | expr_ty
  function expr_ty (line 1878) | expr_ty
  function expr_ty (line 1910) | expr_ty
  function expr_ty (line 1942) | expr_ty
  function expr_ty (line 1967) | expr_ty
  function expr_ty (line 1987) | expr_ty
  function slice_ty (line 2007) | slice_ty
  function slice_ty (line 2018) | slice_ty
  function slice_ty (line 2032) | slice_ty
  function slice_ty (line 2044) | slice_ty
  function comprehension_ty (line 2061) | comprehension_ty
  function excepthandler_ty (line 2084) | excepthandler_ty
  function arguments_ty (line 2101) | arguments_ty
  function keyword_ty (line 2117) | keyword_ty
  function alias_ty (line 2139) | alias_ty
  function type_ignore_ty (line 2156) | type_ignore_ty
  function PyObject (line 2175) | PyObject*
  function PyObject (line 2249) | PyObject*
  function PyObject (line 2640) | PyObject*
  function PyObject (line 3008) | PyObject* ast2obj_expr_context(expr_context_ty o)
  function PyObject (line 3035) | PyObject*
  function PyObject (line 3095) | PyObject* ast2obj_boolop(boolop_ty o)
  function PyObject (line 3110) | PyObject* ast2obj_operator(operator_ty o)
  function PyObject (line 3155) | PyObject* ast2obj_unaryop(unaryop_ty o)
  function PyObject (line 3176) | PyObject* ast2obj_cmpop(cmpop_ty o)
  function PyObject (line 3215) | PyObject*
  function PyObject (line 3249) | PyObject*
  function PyObject (line 3297) | PyObject*
  function PyObject (line 3341) | PyObject*
  function PyObject (line 3370) | PyObject*
  function PyObject (line 3399) | PyObject*
  function obj2ast_mod (line 3433) | int
  function obj2ast_stmt (line 3659) | int
  function obj2ast_expr (line 4923) | int
  function obj2ast_expr_context (line 6005) | int
  function obj2ast_slice (line 6068) | int
  function obj2ast_boolop (line 6203) | int
  function obj2ast_operator (line 6234) | int
  function obj2ast_unaryop (line 6345) | int
  function obj2ast_cmpop (line 6392) | int
  function obj2ast_comprehension (line 6487) | int
  function obj2ast_excepthandler (line 6551) | int
  function obj2ast_arguments (line 6657) | int
  function obj2ast_keyword (line 6771) | int
  function obj2ast_alias (line 6809) | int
  function obj2ast_type_ignore (line 6846) | int
  type PyModuleDef (line 6908) | struct PyModuleDef
  function PyMODINIT_FUNC (line 6911) | PyMODINIT_FUNC
  function PyObject (line 7034) | PyObject* Ta27AST_mod2obj(mod_ty t)
  function mod_ty (line 7041) | mod_ty Ta27AST_obj2mod(PyObject* ast, PyArena* arena, int mode)
  function Ta27AST_Check (line 7074) | int Ta27AST_Check(PyObject* obj)

FILE: ast27/Python/asdl.c
  function asdl_seq (line 4) | asdl_seq *
  function asdl_int_seq (line 35) | asdl_int_seq *

FILE: ast27/Python/ast.c
  type compiling (line 19) | struct compiling {
  type compiling (line 26) | struct compiling
  type compiling (line 27) | struct compiling
  type compiling (line 28) | struct compiling
  type compiling (line 29) | struct compiling
  type compiling (line 30) | struct compiling
  type compiling (line 32) | struct compiling
  type compiling (line 33) | struct compiling
  type compiling (line 34) | struct compiling
  type compiling (line 37) | struct compiling
  type compiling (line 39) | struct compiling
  type compiling (line 40) | struct compiling
  type compiling (line 41) | struct compiling
  function identifier (line 55) | static identifier
  function string (line 65) | static string
  function ast_error (line 81) | static int
  function ast_error_finish (line 92) | static void
  function ast_warn (line 133) | static int
  function forbidden_check (line 146) | static int
  function num_stmts (line 178) | static int
  function mod_ty (line 232) | mod_ty
  function operator_ty (line 402) | static operator_ty
  function set_context (line 440) | static int
  function operator_ty (line 558) | static operator_ty
  function cmpop_ty (line 596) | static cmpop_ty
  function asdl_seq (line 648) | static asdl_seq *
  function expr_ty (line 678) | static expr_ty
  function arguments_ty (line 733) | static arguments_ty
  function expr_ty (line 914) | static expr_ty
  function expr_ty (line 946) | static expr_ty
  function asdl_seq (line 982) | static asdl_seq*
  function stmt_ty (line 1003) | static stmt_ty
  function stmt_ty (line 1045) | static stmt_ty
  function expr_ty (line 1075) | static expr_ty
  function expr_ty (line 1102) | static expr_ty
  function count_list_fors (line 1132) | static int
  function count_list_ifs (line 1169) | static int
  function expr_ty (line 1187) | static expr_ty
  function count_comp_fors (line 1290) | static int
  function count_comp_ifs (line 1327) | static int
  function asdl_seq (line 1345) | static asdl_seq *
  function expr_ty (line 1422) | static expr_ty
  function expr_ty (line 1447) | static expr_ty
  function expr_ty (line 1471) | static expr_ty
  function expr_ty (line 1478) | static expr_ty
  function expr_ty (line 1485) | static expr_ty
  function slice_ty (line 1654) | static slice_ty
  function expr_ty (line 1736) | static expr_ty
  function expr_ty (line 1788) | static expr_ty
  function expr_ty (line 1864) | static expr_ty
  function expr_ty (line 1920) | static expr_ty
  function expr_ty (line 1959) | static expr_ty
  function expr_ty (line 2115) | static expr_ty
  function expr_ty (line 2251) | static expr_ty
  function expr_ty (line 2278) | static expr_ty
  function asdl_seq (line 2290) | static asdl_seq*
  function stmt_ty (line 2311) | static stmt_ty
  function stmt_ty (line 2420) | static stmt_ty
  function asdl_seq (line 2454) | static asdl_seq *
  function stmt_ty (line 2477) | static stmt_ty
  function stmt_ty (line 2491) | static stmt_ty
  function alias_ty (line 2576) | static alias_ty
  function stmt_ty (line 2695) | static stmt_ty
  function stmt_ty (line 2804) | static stmt_ty
  function stmt_ty (line 2825) | static stmt_ty
  function stmt_ty (line 2869) | static stmt_ty
  function asdl_seq (line 2899) | static asdl_seq *
  function stmt_ty (line 2969) | static stmt_ty
  function stmt_ty (line 3089) | static stmt_ty
  function stmt_ty (line 3132) | static stmt_ty
  function excepthandler_ty (line 3180) | static excepthandler_ty
  function stmt_ty (line 3234) | static stmt_ty
  function stmt_ty (line 3313) | static stmt_ty
  function stmt_ty (line 3338) | static stmt_ty
  function stmt_ty (line 3382) | static stmt_ty
  function stmt_ty (line 3431) | static stmt_ty
  function PyObject (line 3506) | static PyObject *
  function PyObject (line 3566) | static PyObject *
  function PyObject (line 3584) | static PyObject *
  function PyObject (line 3654) | static PyObject *
  function PyObject (line 3748) | static PyObject *

FILE: ast27/Python/mystrtoul.c
  function Ta27OS_strtoul (line 120) | unsigned long
  function Ta27OS_strtol (line 282) | long

FILE: ast3/Custom/typed_ast.c
  function PARSER_FLAGS (line 65) | static int PARSER_FLAGS(PyCompilerFlags *flags)
  function err_input (line 81) | static void
  function err_free (line 208) | static void
  function node (line 215) | node *
  function _Ta3Parser_UpdateFlags (line 229) | void
  function mod_ty (line 240) | static mod_ty
  function PyObject (line 273) | static PyObject *
  function PyObject (line 295) | static PyObject *
  function PyObject (line 340) | PyObject *
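The entry points above are the C side of typed_ast's parsing API. As a hedged sketch of the behavior they implement: typed_ast's type-comment support was merged into the standard-library `ast` module in Python 3.8, so the same parse result can be observed there without installing typed_ast (with typed_ast itself, the equivalent call is `typed_ast.ast3.parse(source)`):

```python
import ast

# Parse a statement carrying a PEP 484 type comment. Passing
# type_comments=True (Python 3.8+) enables the typed_ast-derived
# behavior of attaching the comment text to the statement node.
source = "x = 1  # type: int"
tree = ast.parse(source, type_comments=True)

assign = tree.body[0]
# The Assign node records the comment's payload ("int"), not the
# full "# type: int" text.
print(assign.type_comment)

# Without type_comments=True the comment is discarded entirely.
plain = ast.parse(source)
print(plain.body[0].type_comment)
```

Note that `typed_ast.ast3.parse` also accepts a `feature_version` argument to parse older Python 3 syntax; the stdlib `ast.parse` gained the same parameter in 3.8.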

FILE: ast3/Include/Python-ast.h
  type _mod (line 5) | struct _mod
  type _stmt (line 7) | struct _stmt
  type _expr (line 9) | struct _expr
  type expr_context_ty (line 11) | typedef enum _expr_context { Load=1, Store=2, Del=3, AugLoad=4, AugStore=5,
  type _slice (line 14) | struct _slice
  type boolop_ty (line 16) | typedef enum _boolop { And=1, Or=2 } boolop_ty;
  type operator_ty (line 18) | typedef enum _operator { Add=1, Sub=2, Mult=3, MatMult=4, Div=5, Mod=6, ...
  type unaryop_ty (line 22) | typedef enum _unaryop { Invert=1, Not=2, UAdd=3, USub=4 } unaryop_ty;
  type cmpop_ty (line 24) | typedef enum _cmpop { Eq=1, NotEq=2, Lt=3, LtE=4, Gt=5, GtE=6, Is=7, IsN...
  type _comprehension (line 27) | struct _comprehension
  type _excepthandler (line 29) | struct _excepthandler
  type _arguments (line 31) | struct _arguments
  type _arg (line 33) | struct _arg
  type _keyword (line 35) | struct _keyword
  type _alias (line 37) | struct _alias
  type _withitem (line 39) | struct _withitem
  type _type_ignore (line 41) | struct _type_ignore
  type _mod_kind (line 44) | enum _mod_kind {Module_kind=1, Interactive_kind=2, Expression_kind=3,
  type _mod (line 46) | struct _mod {
  type _stmt_kind (line 74) | enum _stmt_kind {FunctionDef_kind=1, AsyncFunctionDef_kind=2, ClassDef_k...
  type _stmt (line 82) | struct _stmt {
  type _expr_kind (line 222) | enum _expr_kind {BoolOp_kind=1, BinOp_kind=2, UnaryOp_kind=3, Lambda_kin...
  type _expr (line 231) | struct _expr {
  type _slice_kind (line 384) | enum _slice_kind {Slice_kind=1, ExtSlice_kind=2, Index_kind=3}
  type _slice (line 385) | struct _slice {
  type _comprehension (line 405) | struct _comprehension {
  type _excepthandler_kind (line 412) | enum _excepthandler_kind {ExceptHandler_kind=1}
  type _excepthandler (line 413) | struct _excepthandler {
  type _arguments (line 427) | struct _arguments {
  type _arg (line 436) | struct _arg {
  type _keyword (line 444) | struct _keyword {
  type _alias (line 449) | struct _alias {
  type _withitem (line 454) | struct _withitem {
  type _type_ignore_kind (line 459) | enum _type_ignore_kind {TypeIgnore_kind=1}
  type _type_ignore (line 460) | struct _type_ignore {
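The `_type_ignore` struct listed above backs the `type_ignores` field on `Module` nodes: typed_ast collects `# type: ignore` comments separately from per-statement type comments. A minimal sketch, again using the stdlib `ast` module (Python 3.8+), which inherited this node type from typed_ast:

```python
import ast

# A "# type: ignore" comment does not attach to the statement; it is
# recorded as a TypeIgnore entry on the enclosing Module instead.
tree = ast.parse("x = 1  # type: ignore\n", type_comments=True)

for ti in tree.type_ignores:
    # Each TypeIgnore carries the line it appeared on and any trailing
    # tag text after "ignore" (empty here).
    print(ti.lineno, repr(ti.tag))
```

This split is what allows a type checker to honor per-line suppressions without them showing up as ordinary type comments on AST nodes.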

FILE: ast3/Include/asdl.h
  type PyObject (line 6) | typedef PyObject * identifier;
  type PyObject (line 7) | typedef PyObject * string;
  type PyObject (line 8) | typedef PyObject * bytes;
  type PyObject (line 9) | typedef PyObject * object;
  type PyObject (line 10) | typedef PyObject * singleton;
  type PyObject (line 11) | typedef PyObject * constant;
  type asdl_seq (line 21) | typedef struct {
  type asdl_int_seq (line 26) | typedef struct {

FILE: ast3/Include/bitset.h
  type BYTE (line 12) | typedef BYTE *bitset;

FILE: ast3/Include/grammar.h
  type label (line 14) | typedef struct {
  type labellist (line 23) | typedef struct {
  type arc (line 30) | typedef struct {
  type state (line 37) | typedef struct {
  type dfa (line 50) | typedef struct {
  type grammar (line 61) | typedef struct {

FILE: ast3/Include/node.h
  type node (line 10) | typedef struct _node {

FILE: ast3/Include/parsetok.h
  type perrdetail (line 10) | typedef struct {

FILE: ast3/Include/pycore_pyarena.h
  type PyArena (line 10) | typedef struct _arena PyArena;

FILE: ast3/Parser/acceler.c
  function Ta3Grammar_AddAccelerators (line 23) | void
  function Ta3Grammar_RemoveAccelerators (line 34) | void
  function fixdfa (line 53) | static void
  function fixstate (line 63) | static void

FILE: ast3/Parser/asdl.py
  class AST (line 39) | class AST:
    method __repr__ (line 40) | def __repr__(self):
  class Module (line 43) | class Module(AST):
    method __init__ (line 44) | def __init__(self, name, dfns):
    method __repr__ (line 49) | def __repr__(self):
  class Type (line 52) | class Type(AST):
    method __init__ (line 53) | def __init__(self, name, value):
    method __repr__ (line 57) | def __repr__(self):
  class Constructor (line 60) | class Constructor(AST):
    method __init__ (line 61) | def __init__(self, name, fields=None):
    method __repr__ (line 65) | def __repr__(self):
  class Field (line 68) | class Field(AST):
    method __init__ (line 69) | def __init__(self, type, name=None, seq=False, opt=False):
    method __repr__ (line 75) | def __repr__(self):
  class Sum (line 87) | class Sum(AST):
    method __init__ (line 88) | def __init__(self, types, attributes=None):
    method __repr__ (line 92) | def __repr__(self):
  class Product (line 98) | class Product(AST):
    method __init__ (line 99) | def __init__(self, fields, attributes=None):
    method __repr__ (line 103) | def __repr__(self):
  class VisitorBase (line 115) | class VisitorBase(object):
    method __init__ (line 117) | def __init__(self):
    method visit (line 120) | def visit(self, obj, *args):
  class Check (line 134) | class Check(VisitorBase):
    method __init__ (line 139) | def __init__(self):
    method visitModule (line 145) | def visitModule(self, mod):
    method visitType (line 149) | def visitType(self, type):
    method visitSum (line 152) | def visitSum(self, sum, name):
    method visitConstructor (line 156) | def visitConstructor(self, cons, name):
    method visitField (line 168) | def visitField(self, field, name):
    method visitProduct (line 173) | def visitProduct(self, prod, name):
  function check (line 177) | def check(mod):
  function parse (line 196) | def parse(filename):
  class TokenKind (line 203) | class TokenKind:
  class ASDLSyntaxError (line 214) | class ASDLSyntaxError(Exception):
    method __init__ (line 215) | def __init__(self, msg, lineno=None):
    method __str__ (line 219) | def __str__(self):
  function tokenize_asdl (line 222) | def tokenize_asdl(buf):
  class ASDLParser (line 244) | class ASDLParser:
    method __init__ (line 251) | def __init__(self):
    method parse (line 255) | def parse(self, buf):
    method _parse_module (line 262) | def _parse_module(self):
    method _parse_definitions (line 275) | def _parse_definitions(self):
    method _parse_type (line 284) | def _parse_type(self):
    method _parse_product (line 300) | def _parse_product(self):
    method _parse_fields (line 303) | def _parse_fields(self):
    method _parse_optional_fields (line 319) | def _parse_optional_fields(self):
    method _parse_optional_attributes (line 325) | def _parse_optional_attributes(self):
    method _parse_optional_field_quantifier (line 332) | def _parse_optional_field_quantifier(self):
    method _advance (line 342) | def _advance(self):
    method _match (line 355) | def _match(self, kind):
    method _at_keyword (line 374) | def _at_keyword(self, keyword):

FILE: ast3/Parser/asdl_c.py
  function get_c_type (line 11) | def get_c_type(name):
  function reflow_lines (line 21) | def reflow_lines(s, depth):
  function is_simple (line 63) | def is_simple(sum):
  class EmitVisitor (line 75) | class EmitVisitor(asdl.VisitorBase):
    method __init__ (line 78) | def __init__(self, file):
    method emit_identifier (line 83) | def emit_identifier(self, name):
    method emit (line 90) | def emit(self, s, depth, reflow=True):
  class TypeDefVisitor (line 102) | class TypeDefVisitor(EmitVisitor):
    method visitModule (line 103) | def visitModule(self, mod):
    method visitType (line 107) | def visitType(self, type, depth=0):
    method visitSum (line 110) | def visitSum(self, sum, name, depth):
    method simple_sum (line 116) | def simple_sum(self, sum, name, depth):
    method sum_with_constructors (line 127) | def sum_with_constructors(self, sum, name, depth):
    method visitProduct (line 133) | def visitProduct(self, product, name, depth):
  class StructVisitor (line 140) | class StructVisitor(EmitVisitor):
    method visitModule (line 143) | def visitModule(self, mod):
    method visitType (line 147) | def visitType(self, type, depth=0):
    method visitSum (line 150) | def visitSum(self, sum, name, depth):
    method sum_with_constructors (line 154) | def sum_with_constructors(self, sum, name, depth):
    method visitConstructor (line 178) | def visitConstructor(self, cons, depth):
    method visitField (line 186) | def visitField(self, field, depth):
    method visitProduct (line 199) | def visitProduct(self, product, name, depth):
  class PrototypeVisitor (line 212) | class PrototypeVisitor(EmitVisitor):
    method visitModule (line 215) | def visitModule(self, mod):
    method visitType (line 219) | def visitType(self, type):
    method visitSum (line 222) | def visitSum(self, sum, name):
    method get_args (line 229) | def get_args(self, fields):
    method visitConstructor (line 256) | def visitConstructor(self, cons, type, attrs):
    method emit_function (line 262) | def emit_function(self, name, ctype, args, attrs, union=True):
    method visitProduct (line 277) | def visitProduct(self, prod, name):
  class FunctionVisitor (line 284) | class FunctionVisitor(PrototypeVisitor):
    method emit_function (line 287) | def emit_function(self, name, ctype, args, attrs, union=True):
    method emit_body_union (line 321) | def emit_body_union(self, name, args, attrs):
    method emit_body_struct (line 330) | def emit_body_struct(self, name, args, attrs):
  class PickleVisitor (line 339) | class PickleVisitor(EmitVisitor):
    method visitModule (line 341) | def visitModule(self, mod):
    method visitType (line 345) | def visitType(self, type):
    method visitSum (line 348) | def visitSum(self, sum, name):
    method visitProduct (line 351) | def visitProduct(self, sum, name):
    method visitConstructor (line 354) | def visitConstructor(self, cons, name):
    method visitField (line 357) | def visitField(self, sum):
  class Obj2ModPrototypeVisitor (line 361) | class Obj2ModPrototypeVisitor(PickleVisitor):
    method visitProduct (line 362) | def visitProduct(self, prod, name):
  class Obj2ModVisitor (line 369) | class Obj2ModVisitor(PickleVisitor):
    method funcHeader (line 370) | def funcHeader(self, name):
    method sumTrailer (line 378) | def sumTrailer(self, name, add_label=False):
    method simpleSum (line 391) | def simpleSum(self, sum, name):
    method buildArgs (line 406) | def buildArgs(self, fields):
    method complexSum (line 409) | def complexSum(self, sum, name):
    method visitAttributeDeclaration (line 441) | def visitAttributeDeclaration(self, a, name, sum=sum):
    method visitSum (line 445) | def visitSum(self, sum, name):
    method visitProduct (line 451) | def visitProduct(self, prod, name):
    method visitFieldDeclaration (line 476) | def visitFieldDeclaration(self, field, name, sum=None, prod=None, dept...
    method isSimpleSum (line 487) | def isSimpleSum(self, field):
    method isNumeric (line 492) | def isNumeric(self, field):
    method isSimpleType (line 495) | def isSimpleType(self, field):
    method visitField (line 498) | def visitField(self, field, name, sum=None, prod=None, depth=0):
  class MarshalPrototypeVisitor (line 561) | class MarshalPrototypeVisitor(PickleVisitor):
    method prototype (line 563) | def prototype(self, sum, name):
  class PyTypesDeclareVisitor (line 571) | class PyTypesDeclareVisitor(PickleVisitor):
    method visitProduct (line 573) | def visitProduct(self, prod, name):
    method visitSum (line 591) | def visitSum(self, sum, name):
    method visitConstructor (line 612) | def visitConstructor(self, cons, name):
  class PyTypesVisitor (line 622) | class PyTypesVisitor(PickleVisitor):
    method visitModule (line 624) | def visitModule(self, mod):
    method visitProduct (line 986) | def visitProduct(self, prod, name):
    method visitSum (line 1000) | def visitSum(self, sum, name):
    method visitConstructor (line 1013) | def visitConstructor(self, cons, name, simple):
  class ASTModuleVisitor (line 1027) | class ASTModuleVisitor(PickleVisitor):
    method visitModule (line 1029) | def visitModule(self, mod):
    method visitProduct (line 1054) | def visitProduct(self, prod, name):
    method visitSum (line 1057) | def visitSum(self, sum, name):
    method visitConstructor (line 1062) | def visitConstructor(self, cons, name):
    method addObj (line 1065) | def addObj(self, name):
  function find_sequence (line 1071) | def find_sequence(fields, doing_specialization):
  function has_sequence (line 1081) | def has_sequence(types, doing_specialization):
  class StaticVisitor (line 1088) | class StaticVisitor(PickleVisitor):
    method visit (line 1091) | def visit(self, object):
  class ObjVisitor (line 1095) | class ObjVisitor(PickleVisitor):
    method func_begin (line 1097) | def func_begin(self, name):
    method func_end (line 1109) | def func_end(self):
    method visitSum (line 1118) | def visitSum(self, sum, name):
    method simpleSum (line 1136) | def simpleSum(self, sum, name):
    method visitProduct (line 1152) | def visitProduct(self, prod, name):
    method visitConstructor (line 1166) | def visitConstructor(self, cons, enum, name):
    method visitField (line 1174) | def visitField(self, field, name, depth, product):
    method emitSeq (line 1187) | def emitSeq(self, field, value, depth, emit):
    method set (line 1199) | def set(self, field, value, depth):
  class PartingShots (line 1221) | class PartingShots(StaticVisitor):
  class ChainOfVisitors (line 1270) | class ChainOfVisitors:
    method __init__ (line 1271) | def __init__(self, *visitors):
    method visit (line 1274) | def visit(self, object):
  function main (line 1281) | def main(srcfile, dump_module=False):

FILE: ast3/Parser/bitset.c
  function bitset (line 7) | bitset
  function delbitset (line 22) | void
  function addbit (line 28) | int
  function testbit (line 41) | int
  function samebitset (line 48) | int
  function mergebitset (line 59) | void

FILE: ast3/Parser/grammar.c
  function grammar (line 14) | grammar *
  function freegrammar (line 31) | void
  function dfa (line 48) | dfa *
  function addstate (line 67) | int
  function addarc (line 86) | void
  function addlabel (line 104) | int
  function findlabel (line 130) | int
  function translatelabels (line 154) | void
  function translabel (line 167) | static void

FILE: ast3/Parser/grammar1.c
  function dfa (line 11) | dfa *

FILE: ast3/Parser/node.c
  function node (line 7) | node *
  function fancy_roundup (line 22) | static int
  function Ta3Node_AddChild (line 78) | int
  function Ta3Node_Free (line 120) | void
  function Py_ssize_t (line 129) | Py_ssize_t
  function freechildren (line 139) | static void
  function Py_ssize_t (line 151) | static Py_ssize_t

FILE: ast3/Parser/parser.c
  function s_reset (line 29) | static void
  function s_push (line 37) | static int
  function s_pop (line 54) | static void
  function parser_state (line 71) | parser_state *
  function Ta3Parser_Delete (line 95) | void
  function shift (line 107) | static int
  function push (line 119) | static int
  function classify (line 136) | static int
  function future_hack (line 184) | static void
  function Ta3Parser_AddToken (line 226) | int
  function dumptree (line 340) | void
  function showtree (line 364) | void

FILE: ast3/Parser/parser.h
  type stackentry (line 12) | typedef struct {
  type stack (line 18) | typedef struct {
  type parser_state (line 24) | typedef struct {

FILE: ast3/Parser/parsetok.c
  type tok_state (line 15) | struct tok_state
  function node (line 19) | node *
  function node (line 25) | node *
  function node (line 33) | node *
  function node (line 43) | node *
  function node (line 72) | node *
  function node (line 97) | node *
  function node (line 106) | node *
  function node (line 117) | node *
  function node (line 139) | node *
  function warn (line 172) | static void
  type growable_comment_array (line 182) | typedef struct {
  function growable_comment_array_init (line 191) | static int
  function growable_comment_array_add (line 201) | static int
  function growable_comment_array_deallocate (line 217) | static void
  function node (line 229) | static node *
  function initerr (line 447) | static int

FILE: ast3/Parser/tokenizer.c
  type tok_state (line 59) | struct tok_state
  type tok_state (line 60) | struct tok_state
  type tok_state (line 61) | struct tok_state
  type tok_state (line 140) | struct tok_state
  type tok_state (line 143) | struct tok_state
  type tok_state (line 143) | struct tok_state
  type tok_state (line 144) | struct tok_state
  type tok_state (line 182) | struct tok_state
  type tok_state (line 197) | struct tok_state
  function decoding_feof (line 202) | static int
  type tok_state (line 209) | struct tok_state
  type tok_state (line 217) | struct tok_state
  function get_coding_spec (line 259) | static int
  function check_coding_spec (line 313) | static int
  function check_bom (line 372) | static int
  type tok_state (line 450) | struct tok_state
  function fp_setreadl (line 520) | static int
  function fp_getc (line 570) | static int fp_getc(struct tok_state *tok) {
  function fp_ungetc (line 576) | static void fp_ungetc(int c, struct tok_state *tok) {
  function valid_utf8 (line 583) | static int valid_utf8(const unsigned char* s)
  type tok_state (line 612) | struct tok_state
  function decoding_feof (line 668) | static int
  function buf_getc (line 690) | static int
  function buf_ungetc (line 697) | static void
  function buf_setreadl (line 706) | static int
  function PyObject (line 715) | static PyObject *
  type tok_state (line 728) | struct tok_state
  type tok_state (line 773) | struct tok_state
  type tok_state (line 832) | struct tok_state
  type tok_state (line 835) | struct tok_state
  type tok_state (line 849) | struct tok_state
  type tok_state (line 852) | struct tok_state
  type tok_state (line 880) | struct tok_state
  type tok_state (line 884) | struct tok_state
  function Ta3Tokenizer_Free (line 913) | void
  function tok_nextc (line 932) | static int
  function tok_backup (line 1131) | static void
  function Ta3Token_OneChar (line 1145) | int
  function Ta3Token_TwoChars (line 1177) | int
  function Ta3Token_ThreeChars (line 1256) | int
  function indenterror (line 1314) | static int
  function verify_identifier (line 1328) | static int
  function tok_decimal_tail (line 1353) | static int
  function tok_get (line 1377) | static int
  function Ta3Tokenizer_Get (line 1962) | int
  type tok_state (line 1988) | struct tok_state
  function tok_dump (line 2045) | void

FILE: ast3/Parser/tokenizer.h
  type decoding_state (line 15) | enum decoding_state {
  type tok_state (line 22) | struct tok_state {
  type tok_state (line 75) | struct tok_state
  type tok_state (line 76) | struct tok_state
  type tok_state (line 77) | struct tok_state
  type tok_state (line 79) | struct tok_state
  type tok_state (line 80) | struct tok_state

FILE: ast3/Python/Python-ast.c
  type AST_object (line 538) | typedef struct {
  function ast_dealloc (line 543) | static void
  function ast_traverse (line 552) | static int
  function ast_clear (line 559) | static int
  function lookup_attr_id (line 566) | static int lookup_attr_id(PyObject *v, _Py_Identifier *name, PyObject **...
  function ast_type_init (line 583) | static int
  function PyObject (line 633) | static PyObject *
  function PyTypeObject (line 700) | static PyTypeObject* make_type(char *type, PyTypeObject* base, char**fie...
  function add_attributes (line 725) | static int add_attributes(PyTypeObject* type, char**attrs, int num_fields)
  function PyObject (line 746) | static PyObject* ast2obj_list(asdl_seq *seq, PyObject* (*func)(void*))
  function PyObject (line 764) | static PyObject* ast2obj_object(void *o)
  function PyObject (line 777) | static PyObject* ast2obj_int(long b)
  function obj2ast_singleton (line 784) | static int obj2ast_singleton(PyObject *obj, PyObject** out, PyArena* arena)
  function obj2ast_object (line 795) | static int obj2ast_object(PyObject* obj, PyObject** out, PyArena* arena)
  function obj2ast_constant (line 810) | static int obj2ast_constant(PyObject* obj, PyObject** out, PyArena* arena)
  function obj2ast_identifier (line 823) | static int obj2ast_identifier(PyObject* obj, PyObject** out, PyArena* ar...
  function obj2ast_string (line 832) | static int obj2ast_string(PyObject* obj, PyObject** out, PyArena* arena)
  function obj2ast_bytes (line 841) | static int obj2ast_bytes(PyObject* obj, PyObject** out, PyArena* arena)
  function obj2ast_int (line 850) | static int obj2ast_int(PyObject* obj, int* out, PyArena* arena)
  function add_ast_fields (line 865) | static int add_ast_fields(void)
  function init_types (line 883) | static int init_types(void)
  function mod_ty (line 1246) | mod_ty
  function mod_ty (line 1259) | mod_ty
  function mod_ty (line 1271) | mod_ty
  function mod_ty (line 1288) | mod_ty
  function mod_ty (line 1306) | mod_ty
  function stmt_ty (line 1318) | stmt_ty
  function stmt_ty (line 1349) | stmt_ty
  function stmt_ty (line 1380) | stmt_ty
  function stmt_ty (line 1405) | stmt_ty
  function stmt_ty (line 1419) | stmt_ty
  function stmt_ty (line 1433) | stmt_ty
  function stmt_ty (line 1455) | stmt_ty
  function stmt_ty (line 1487) | stmt_ty
  function stmt_ty (line 1515) | stmt_ty
  function stmt_ty (line 1544) | stmt_ty
  function stmt_ty (line 1573) | stmt_ty
  function stmt_ty (line 1595) | stmt_ty
  function stmt_ty (line 1617) | stmt_ty
  function stmt_ty (line 1634) | stmt_ty
  function stmt_ty (line 1651) | stmt_ty
  function stmt_ty (line 1666) | stmt_ty
  function stmt_ty (line 1684) | stmt_ty
  function stmt_ty (line 1704) | stmt_ty
  function stmt_ty (line 1718) | stmt_ty
  function stmt_ty (line 1735) | stmt_ty
  function stmt_ty (line 1749) | stmt_ty
  function stmt_ty (line 1763) | stmt_ty
  function stmt_ty (line 1782) | stmt_ty
  function stmt_ty (line 1795) | stmt_ty
  function stmt_ty (line 1808) | stmt_ty
  function expr_ty (line 1821) | expr_ty
  function expr_ty (line 1842) | expr_ty
  function expr_ty (line 1874) | expr_ty
  function expr_ty (line 1900) | expr_ty
  function expr_ty (line 1926) | expr_ty
  function expr_ty (line 1958) | expr_ty
  function expr_ty (line 1974) | expr_ty
  function expr_ty (line 1988) | expr_ty
  function expr_ty (line 2009) | expr_ty
  function expr_ty (line 2030) | expr_ty
  function expr_ty (line 2057) | expr_ty
  function expr_ty (line 2078) | expr_ty
  function expr_ty (line 2097) | expr_ty
  function expr_ty (line 2111) | expr_ty
  function expr_ty (line 2130) | expr_ty
  function expr_ty (line 2152) | expr_ty
  function expr_ty (line 2174) | expr_ty
  function expr_ty (line 2193) | expr_ty
  function expr_ty (line 2218) | expr_ty
  function expr_ty (line 2240) | expr_ty
  function expr_ty (line 2254) | expr_ty
  function expr_ty (line 2279) | expr_ty
  function expr_ty (line 2298) | expr_ty
  function expr_ty (line 2311) | expr_ty
  function expr_ty (line 2330) | expr_ty
  function expr_ty (line 2362) | expr_ty
  function expr_ty (line 2394) | expr_ty
  function expr_ty (line 2420) | expr_ty
  function expr_ty (line 2446) | expr_ty
  function expr_ty (line 2467) | expr_ty
  function slice_ty (line 2488) | slice_ty
  function slice_ty (line 2502) | slice_ty
  function slice_ty (line 2514) | slice_ty
  function comprehension_ty (line 2531) | comprehension_ty
  function excepthandler_ty (line 2556) | excepthandler_ty
  function arguments_ty (line 2573) | arguments_ty
  function arg_ty (line 2590) | arg_ty
  function keyword_ty (line 2611) | keyword_ty
  function alias_ty (line 2628) | alias_ty
  function withitem_ty (line 2645) | withitem_ty
  function type_ignore_ty (line 2662) | type_ignore_ty
  function PyObject (line 2681) | PyObject*
  function PyObject (line 2754) | PyObject*
  function PyObject (line 3213) | PyObject*
  function PyObject (line 3658) | PyObject* ast2obj_expr_context(expr_context_ty o)
  function PyObject (line 3685) | PyObject*
  function PyObject (line 3740) | PyObject* ast2obj_boolop(boolop_ty o)
  function PyObject (line 3755) | PyObject* ast2obj_operator(operator_ty o)
  function PyObject (line 3803) | PyObject* ast2obj_unaryop(unaryop_ty o)
  function PyObject (line 3824) | PyObject* ast2obj_cmpop(cmpop_ty o)
  function PyObject (line 3863) | PyObject*
  function PyObject (line 3901) | PyObject*
  function PyObject (line 3948) | PyObject*
  function PyObject (line 3996) | PyObject*
  function PyObject (line 4039) | PyObject*
  function PyObject (line 4067) | PyObject*
  function PyObject (line 4095) | PyObject*
  function PyObject (line 4123) | PyObject*
  function obj2ast_mod (line 4156) | int
  function obj2ast_stmt (line 4407) | int
  function obj2ast_expr (line 6144) | int
  function obj2ast_expr_context (line 7558) | int
  function obj2ast_slice (line 7616) | int
  function obj2ast_boolop (line 7751) | int
  function obj2ast_operator (line 7777) | int
  function obj2ast_unaryop (line 7891) | int
  function obj2ast_cmpop (line 7933) | int
  function obj2ast_comprehension (line 8023) | int
  function obj2ast_excepthandler (line 8108) | int
  function obj2ast_arguments (line 8223) | int
  function obj2ast_arg (line 8388) | int
  function obj2ast_keyword (line 8470) | int
  function obj2ast_alias (line 8510) | int
  function obj2ast_withitem (line 8550) | int
  function obj2ast_type_ignore (line 8590) | int
  type PyModuleDef (line 8652) | struct PyModuleDef
  function PyMODINIT_FUNC (line 8655) | PyMODINIT_FUNC
  function PyObject (line 8867) | PyObject* Ta3AST_mod2obj(mod_ty t)
  function mod_ty (line 8875) | mod_ty Ta3AST_obj2mod(PyObject* ast, PyArena* arena, int mode)
  function Ta3AST_Check (line 8905) | int Ta3AST_Check(PyObject* obj)

FILE: ast3/Python/asdl.c
  function asdl_seq (line 4) | asdl_seq *
  function asdl_int_seq (line 35) | asdl_int_seq *

FILE: ast3/Python/ast.c
  function PyObject (line 24) | static PyObject *
  function PyObject (line 50) | static PyObject *
  function PyObject (line 64) | PyObject *
  function validate_comprehension (line 86) | static int
  function validate_slice (line 104) | static int
  function validate_keywords (line 129) | static int
  function validate_args (line 139) | static int
  function validate_arguments (line 172) | static int
  function validate_constant (line 199) | static int
  function validate_expr (line 245) | static int
  function validate_nonempty_seq (line 417) | static int
  function validate_assignlist (line 426) | static int
  function validate_body (line 433) | static int
  function validate_stmt (line 439) | static int
  function validate_stmts (line 581) | static int
  function validate_exprs (line 600) | static int
  function Ta3AST_Validate (line 620) | int
  type compiling (line 652) | struct compiling {
  type compiling (line 659) | struct compiling
  type compiling (line 660) | struct compiling
  type compiling (line 661) | struct compiling
  type compiling (line 662) | struct compiling
  type compiling (line 663) | struct compiling
  type compiling (line 665) | struct compiling
  type compiling (line 666) | struct compiling
  type compiling (line 668) | struct compiling
  type compiling (line 669) | struct compiling
  type compiling (line 672) | struct compiling
  type compiling (line 674) | struct compiling
  type compiling (line 675) | struct compiling
  function init_normalization (line 681) | static int
  function identifier (line 694) | static identifier
  function string (line 744) | static string
  function ast_error (line 758) | static int
  function num_stmts (line 800) | static int
  function mod_ty (line 853) | mod_ty
  function mod_ty (line 1022) | mod_ty
  function operator_ty (line 1040) | static operator_ty
  function forbidden_name (line 1085) | static int
  function set_context (line 1113) | static int
  function operator_ty (line 1241) | static operator_ty
  function cmpop_ty (line 1286) | static cmpop_ty
  function asdl_seq (line 1340) | static asdl_seq *
  function arg_ty (line 1369) | static arg_ty
  function handle_keywordonly_args (line 1403) | static int
  function arguments_ty (line 1479) | static arguments_ty
  function expr_ty (line 1679) | static expr_ty
  function expr_ty (line 1711) | static expr_ty
  function asdl_seq (line 1747) | static asdl_seq*
  function stmt_ty (line 1768) | static stmt_ty
  function stmt_ty (line 1837) | static stmt_ty
  function stmt_ty (line 1849) | static stmt_ty
  function stmt_ty (line 1858) | static stmt_ty
  function stmt_ty (line 1885) | static stmt_ty
  function expr_ty (line 1918) | static expr_ty
  function expr_ty (line 1946) | static expr_ty
  function count_comp_fors (line 1972) | static int
  function count_comp_ifs (line 2022) | static int
  function asdl_seq (line 2040) | static asdl_seq *
  function expr_ty (line 2137) | static expr_ty
  function ast_for_dictelement (line 2176) | static int
  function expr_ty (line 2212) | static expr_ty
  function expr_ty (line 2231) | static expr_ty
  function expr_ty (line 2264) | static expr_ty
  function expr_ty (line 2271) | static expr_ty
  function expr_ty (line 2278) | static expr_ty
  function expr_ty (line 2285) | static expr_ty
  function expr_ty (line 2307) | static expr_ty
  function slice_ty (line 2470) | static slice_ty
  function expr_ty (line 2535) | static expr_ty
  function expr_ty (line 2587) | static expr_ty
  function expr_ty (line 2663) | static expr_ty
  function expr_ty (line 2688) | static expr_ty
  function expr_ty (line 2737) | static expr_ty
  function expr_ty (line 2758) | static expr_ty
  function expr_ty (line 2776) | static expr_ty
  function expr_ty (line 2941) | static expr_ty
  function expr_ty (line 3106) | static expr_ty
  function stmt_ty (line 3130) | static stmt_ty
  function asdl_seq (line 3320) | static asdl_seq *
  function stmt_ty (line 3343) | static stmt_ty
  function stmt_ty (line 3357) | static stmt_ty
  function alias_ty (line 3417) | static alias_ty
  function stmt_ty (line 3538) | static stmt_ty
  function stmt_ty (line 3651) | static stmt_ty
  function stmt_ty (line 3672) | static stmt_ty
  function stmt_ty (line 3693) | static stmt_ty
  function asdl_seq (line 3722) | static asdl_seq *
  function stmt_ty (line 3792) | static stmt_ty
  function stmt_ty (line 3912) | static stmt_ty
  function stmt_ty (line 3953) | static stmt_ty
  function excepthandler_ty (line 4018) | static excepthandler_ty
  function stmt_ty (line 4072) | static stmt_ty
  function withitem_ty (line 4137) | static withitem_ty
  function stmt_ty (line 4161) | static stmt_ty
  function stmt_ty (line 4209) | static stmt_ty
  function stmt_ty (line 4271) | static stmt_ty
  function PyObject (line 4345) | static PyObject *
  function PyObject (line 4388) | static PyObject *
  function PyObject (line 4416) | static PyObject *
  function warn_invalid_escape_sequence (line 4427) | static int
  function PyObject (line 4459) | static PyObject *
  function PyObject (line 4532) | static PyObject *
  function fstring_shift_node_locations (line 4559) | static void fstring_shift_node_locations(node *n, int lineno, int col_of...
  function fstring_fix_node_location (line 4579) | static void
  function expr_ty (line 4609) | static expr_ty
  function fstring_find_literal (line 4684) | static int
  type compiling (line 4764) | struct compiling
  function fstring_find_expr (line 4776) | static int
  function fstring_find_literal_and_expr (line 4998) | static int
  type ExprList (line 5041) | typedef struct {
  function ExprList_check_invariants (line 5061) | static void
  function ExprList_Init (line 5073) | static void
  function ExprList_Append (line 5085) | static int
  function ExprList_Dealloc (line 5125) | static void
  function asdl_seq (line 5142) | static asdl_seq *
  type FstringParser (line 5163) | typedef struct {
  function FstringParser_check_invariants (line 5172) | static void
  function FstringParser_Init (line 5181) | static void
  function FstringParser_Dealloc (line 5190) | static void
  function PyObject (line 5199) | static PyObject *
  function expr_ty (line 5212) | static expr_ty
  function FstringParser_ConcatAndDel (line 5231) | static int
  function FstringParser_ConcatFstring (line 5258) | static int
  function expr_ty (line 5346) | static expr_ty
  function expr_ty (line 5391) | static expr_ty
  function parsestr (line 5414) | static int
  function expr_ty (line 5529) | static expr_ty

FILE: ast3/tests/test_basics.py
  function test_basics (line 23) | def test_basics():
  function test_redundantdef (line 35) | def test_redundantdef():
  function test_vardecl (line 45) | def test_vardecl():
  function test_forstmt (line 57) | def test_forstmt():
  function test_withstmt (line 67) | def test_withstmt():
  function test_longargs (line 170) | def test_longargs():
  function test_ignores (line 205) | def test_ignores():
  function test_asyncfunc (line 234) | def test_asyncfunc():
  function test_asyncvar (line 247) | def test_asyncvar():
  function test_asynccomp (line 259) | def test_asynccomp():
  function test_matmul (line 270) | def test_matmul():
  function test_strkind (line 284) | def test_strkind():
  function test_convert_strs (line 302) | def test_convert_strs():
  function test_simple_fstring (line 312) | def test_simple_fstring():
  function test_await_fstring (line 322) | def test_await_fstring():

FILE: typed_ast/ast27.py
  function parse (line 45) | def parse(source, filename='<unknown>', mode='exec'):
  function literal_eval (line 53) | def literal_eval(node_or_string):
  function dump (line 96) | def dump(node, annotate_fields=True, include_attributes=False):
  function copy_location (line 126) | def copy_location(new_node, old_node):
  function fix_missing_locations (line 138) | def fix_missing_locations(node):
  function increment_lineno (line 163) | def increment_lineno(node, n=1):
  function iter_fields (line 174) | def iter_fields(node):
  function iter_child_nodes (line 186) | def iter_child_nodes(node):
  function get_docstring (line 200) | def get_docstring(node, clean=True):
  function walk (line 216) | def walk(node):
  class NodeVisitor (line 230) | class NodeVisitor(object):
    method visit (line 250) | def visit(self, node):
    method generic_visit (line 256) | def generic_visit(self, node):
  class NodeTransformer (line 267) | class NodeTransformer(NodeVisitor):
    method generic_visit (line 303) | def generic_visit(self, node):

FILE: typed_ast/ast3.py
  function parse (line 45) | def parse(source, filename='<unknown>', mode='exec', feature_version=LAT...
  function literal_eval (line 66) | def literal_eval(node_or_string):
  function dump (line 114) | def dump(node, annotate_fields=True, include_attributes=False):
  function copy_location (line 144) | def copy_location(new_node, old_node):
  function fix_missing_locations (line 156) | def fix_missing_locations(node):
  function increment_lineno (line 181) | def increment_lineno(node, n=1):
  function iter_fields (line 192) | def iter_fields(node):
  function iter_child_nodes (line 204) | def iter_child_nodes(node):
  function get_docstring (line 218) | def get_docstring(node, clean=True):
  function walk (line 241) | def walk(node):
  class NodeVisitor (line 255) | class NodeVisitor(object):
    method visit (line 275) | def visit(self, node):
    method generic_visit (line 281) | def generic_visit(self, node):
  class NodeTransformer (line 292) | class NodeTransformer(NodeVisitor):
    method generic_visit (line 328) | def generic_visit(self, node):

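The `typed_ast.ast3.parse` entry above shows the library's main entry point, which mirrors the stdlib API while adding type-comment support. Since Python 3.8 the standard library `ast` module (which this repository's README recommends as typed_ast's replacement) exposes the same capability, so a minimal illustrative sketch of the type-comment parsing can use the stdlib directly:

```python
import ast

# Stdlib ast (Python 3.8+) absorbed typed_ast's type-comment support.
# With type_comments=True, "# type: ..." comments are attached to nodes.
tree = ast.parse("x = 1  # type: int", type_comments=True)
assign = tree.body[0]

# The Assign node carries the comment's payload, with "# type: " stripped.
print(assign.type_comment)
```

Without `type_comments=True`, the same comment is discarded and `type_comment` is `None`; that opt-in flag is the stdlib's equivalent of what `typed_ast.ast3.parse` did by default.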
FILE: typed_ast/conversions.py
  function py2to3 (line 4) | def py2to3(ast):
  function _copy_attributes (line 23) | def _copy_attributes(new_value, old_value):
  class _AST2To3 (line 30) | class _AST2To3(ast27.NodeTransformer):
    method __init__ (line 32) | def __init__(self):
    method visit (line 35) | def visit(self, node):
    method maybe_visit (line 42) | def maybe_visit(self, node):
    method generic_visit (line 48) | def generic_visit(self, node):
    method visit_list (line 60) | def visit_list(self, l):
    method visit_FunctionDef (line 63) | def visit_FunctionDef(self, n):
    method visit_ClassDef (line 68) | def visit_ClassDef(self, n):
    method visit_TryExcept (line 73) | def visit_TryExcept(self, n):
    method visit_TryFinally (line 79) | def visit_TryFinally(self, n):
    method visit_ExceptHandler (line 91) | def visit_ExceptHandler(self, n):
    method visit_Print (line 103) | def visit_Print(self, n):
    method visit_Raise (line 117) | def visit_Raise(self, n):
    method visit_Exec (line 135) | def visit_Exec(self, n):
    method visit_Repr (line 149) | def visit_Repr(self, n):
    method visit_With (line 155) | def visit_With(self, n):
    method visit_Call (line 160) | def visit_Call(self, n):
    method visit_Ellipsis (line 174) | def visit_Ellipsis(self, n):
    method visit_arguments (line 178) | def visit_arguments(self, n):
    method visit_Str (line 218) | def visit_Str(self, s):
    method visit_Num (line 224) | def visit_Num(self, n):
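The `_AST2To3` converter listed above is built on the `NodeTransformer` visitor pattern: each `visit_<NodeType>` method receives a node of that type and returns its replacement. A minimal sketch of the same pattern using the stdlib `ast` module (the `IncrementConstants` class and its constant-rewriting behavior are illustrative only, not part of typed_ast):

```python
import ast

class IncrementConstants(ast.NodeTransformer):
    """Rewrite every integer literal n into n + 1."""

    def visit_Constant(self, node):
        if isinstance(node.value, int):
            # Returning a new node replaces the old one in the tree;
            # copy_location preserves lineno/col_offset for compilation.
            return ast.copy_location(ast.Constant(value=node.value + 1), node)
        return node

tree = ast.parse("x = 1 + 2")
new_tree = ast.fix_missing_locations(IncrementConstants().visit(tree))

ns = {}
exec(compile(new_tree, "<ast>", "exec"), ns)
print(ns["x"])  # 1 and 2 become 2 and 3, so x is 5
```

`_AST2To3` follows the same shape, except its `visit_*` methods translate Python 2.7 node types (`Print`, `Exec`, `Repr`, ...) into their Python 3 equivalents, and helpers like `maybe_visit` and `_copy_attributes` handle optional children and location copying.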

Condensed preview — 95 files, each showing path, character count, and a content snippet (1,604K chars total).
[
  {
    "path": ".gitattributes",
    "chars": 868,
    "preview": "# Generated files\n# https://github.com/github/linguist#generated-code\nast3/Include/graminit.h          linguist-generate"
  },
  {
    "path": ".github/workflows/build.yml",
    "chars": 2597,
    "preview": "name: Build wheels\n\non: [push, pull_request, workflow_dispatch]\n\njobs:\n  build_wheels:\n    name: py${{ matrix.python-ver"
  },
  {
    "path": ".gitignore",
    "chars": 91,
    "preview": "*.o\n*.pyc\n/build/\n__pycache__/\n.DS_Store\n/tools/pgen3\n/.pytest_cache/\n/typed_ast.egg-info/\n"
  },
  {
    "path": "CONTRIBUTING.md",
    "chars": 172,
    "preview": "To contribute code to this project, you'll need to sign the [Python Software Foundation's Contributor License Agreement]"
  },
  {
    "path": "LICENSE",
    "chars": 15187,
    "preview": "Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/\nUpstream-Name: typed-ast\nSource: https://pypi"
  },
  {
    "path": "MANIFEST.in",
    "chars": 378,
    "preview": "include ast27/Grammar/Grammar\ninclude ast27/Parser/Python.asdl\nrecursive-include ast27 *.h\nrecursive-include ast27 *.py\n"
  },
  {
    "path": "README.md",
    "chars": 3763,
    "preview": "# End of life\n\nThis project is no longer maintained.\n\nUse the standard library `ast` module instead.\nSee https://github."
  },
  {
    "path": "ast27/Custom/typed_ast.c",
    "chars": 8986,
    "preview": "#include \"Python.h\"\n#include \"../Include/Python-ast.h\"\n#include \"../Include/compile.h\"\n#include \"../Include/node.h\"\n#inc"
  },
  {
    "path": "ast27/Grammar/Grammar",
    "chars": 6545,
    "preview": "# Grammar for Python\n\n# Note:  Changing the grammar specified in this file will most likely\n#        require correspondi"
  },
  {
    "path": "ast27/Include/Python-ast.h",
    "chars": 22386,
    "preview": "/* File automatically generated by Parser/asdl_c.py. */\n\n#include \"../Include/asdl.h\"\n\ntypedef struct _mod *mod_ty;\n\ntyp"
  },
  {
    "path": "ast27/Include/asdl.h",
    "chars": 1285,
    "preview": "#ifndef Ta27_ASDL_H\n#define Ta27_ASDL_H\n\n#include \"../Include/pyarena.h\"\n\ntypedef PyObject * identifier;\ntypedef PyObjec"
  },
  {
    "path": "ast27/Include/ast.h",
    "chars": 226,
    "preview": "#ifndef Ta27_AST_H\n#define Ta27_AST_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\nmod_ty Ta27AST_FromNode(const node *, PyCo"
  },
  {
    "path": "ast27/Include/bitset.h",
    "chars": 798,
    "preview": "\n#ifndef Ta27_BITSET_H\n#define Ta27_BITSET_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n/* Bitset interface */\n\n#define BYT"
  },
  {
    "path": "ast27/Include/compile.h",
    "chars": 269,
    "preview": "\n#ifndef Ta27_COMPILE_H\n#define Ta27_COMPILE_H\n\n#include \"Python.h\"\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n/* Public i"
  },
  {
    "path": "ast27/Include/errcode.h",
    "chars": 1369,
    "preview": "#ifndef Ta27_ERRCODE_H\n#define Ta27_ERRCODE_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n\n/* Error codes passed around betw"
  },
  {
    "path": "ast27/Include/graminit.h",
    "chars": 1988,
    "preview": "/* Generated by Parser/pgen */\n\n#define single_input 256\n#define file_input 257\n#define eval_input 258\n#define decorator"
  },
  {
    "path": "ast27/Include/grammar.h",
    "chars": 2032,
    "preview": "\n/* Grammar interface */\n\n#ifndef Ta27_GRAMMAR_H\n#define Ta27_GRAMMAR_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n#include"
  },
  {
    "path": "ast27/Include/node.h",
    "chars": 887,
    "preview": "\n/* Parse tree node interface */\n\n#ifndef Ta27_NODE_H\n#define Ta27_NODE_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\ntypede"
  },
  {
    "path": "ast27/Include/parsetok.h",
    "chars": 1845,
    "preview": "\n/* Parser-tokenizer link interface */\n\n#ifndef Ta27_PARSETOK_H\n#define Ta27_PARSETOK_H\n#ifdef __cplusplus\nextern \"C\" {\n"
  },
  {
    "path": "ast27/Include/pgenheaders.h",
    "chars": 348,
    "preview": "#ifndef DUMMY_Py_PGENHEADERS_H\n#define DUMMY_Py_PGENHEADERS_H\n\n/* pgenheaders.h is included by a bunch of files but noth"
  },
  {
    "path": "ast27/Include/pyarena.h",
    "chars": 364,
    "preview": "/* An arena-like memory interface for the compiler.\n */\n\n#ifndef Ta27_PYARENA_H\n#define Ta27_PYARENA_H\n\n#if PY_MINOR_VER"
  },
  {
    "path": "ast27/Include/pycore_pyarena.h",
    "chars": 2656,
    "preview": "/* An arena-like memory interface for the compiler.\n */\n\n#ifndef Ta27_INTERNAL_PYARENA_H\n#define Ta27_INTERNAL_PYARENA_H"
  },
  {
    "path": "ast27/Include/token.h",
    "chars": 1847,
    "preview": "\n/* Token types */\n\n#ifndef Ta27_TOKEN_H\n#define Ta27_TOKEN_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n#undef TILDE   /* "
  },
  {
    "path": "ast27/Parser/Python.asdl",
    "chars": 4839,
    "preview": "-- ASDL's five builtin types are identifier, int, string, object, bool\n\nmodule Python version \"$Revision$\"\n{\n\tmod = Modu"
  },
  {
    "path": "ast27/Parser/acceler.c",
    "chars": 3404,
    "preview": "\n/* Parser accelerator module */\n\n/* The parser as originally conceived had disappointing performance.\n   This module do"
  },
  {
    "path": "ast27/Parser/asdl.py",
    "chars": 11320,
    "preview": "\"\"\"An implementation of the Zephyr Abstract Syntax Definition Language.\n\nSee http://asdl.sourceforge.net/ and\nhttp://www"
  },
  {
    "path": "ast27/Parser/asdl_c.py",
    "chars": 41795,
    "preview": "#! /usr/bin/env python\n\"\"\"Generate C code from an ASDL description.\"\"\"\n\n# TO DO\n# handle fields that have a type but no "
  },
  {
    "path": "ast27/Parser/bitset.c",
    "chars": 1087,
    "preview": "\n/* Bitset primitives used by the parser generator */\n\n#include \"../Include/pgenheaders.h\"\n#include \"../Include/bitset.h"
  },
  {
    "path": "ast27/Parser/grammar.c",
    "chars": 6947,
    "preview": "\n/* Grammar implementation */\n\n#include \"Python.h\"\n#include \"../Include/pgenheaders.h\"\n\n#include <ctype.h>\n\n#include \".."
  },
  {
    "path": "ast27/Parser/grammar1.c",
    "chars": 1252,
    "preview": "\n/* Grammar subroutines needed by parser */\n\n#include \"Python.h\"\n#include \"../Include/pgenheaders.h\"\n#include \"../Includ"
  },
  {
    "path": "ast27/Parser/node.c",
    "chars": 4568,
    "preview": "/* Parse tree node implementation */\n\n#include \"Python.h\"\n#include \"../Include/node.h\"\n#include \"../Include/errcode.h\"\n\n"
  },
  {
    "path": "ast27/Parser/parser.c",
    "chars": 11502,
    "preview": "\n/* Parser implementation */\n\n/* For a description, see the comments at end of this file */\n\n/* XXX To do: error recover"
  },
  {
    "path": "ast27/Parser/parser.h",
    "chars": 1056,
    "preview": "#ifndef Ta27_PARSER_H\n#define Ta27_PARSER_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n\n/* Parser interface */\n\n#define MAX"
  },
  {
    "path": "ast27/Parser/parsetok.c",
    "chars": 11524,
    "preview": "\n/* Parser-tokenizer link implementation */\n\n#include \"../Include/pgenheaders.h\"\n#include \"tokenizer.h\"\n#include \"../Inc"
  },
  {
    "path": "ast27/Parser/spark.py",
    "chars": 26962,
    "preview": "#  Copyright (c) 1998-2002 John Aycock\n#\n#  Permission is hereby granted, free of charge, to any person obtaining\n#  a c"
  },
  {
    "path": "ast27/Parser/tokenizer.c",
    "chars": 52538,
    "preview": "\n/* Tokenizer implementation */\n\n#include \"Python.h\"\n#include \"../Include/pgenheaders.h\"\n\n#include <ctype.h>\n#include <a"
  },
  {
    "path": "ast27/Parser/tokenizer.h",
    "chars": 3024,
    "preview": "#ifndef Ta27_TOKENIZER_H\n#define Ta27_TOKENIZER_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n#include \"object.h\"\n\n/* Tokeni"
  },
  {
    "path": "ast27/Python/Python-ast.c",
    "chars": 280357,
    "preview": "/* File automatically generated by Parser/asdl_c.py. */\n\n\n/*\n   __version__ 82160.\n\n   This module must be committed sep"
  },
  {
    "path": "ast27/Python/asdl.c",
    "chars": 1449,
    "preview": "#include \"Python.h\"\n#include \"../Include/asdl.h\"\n\nasdl_seq *\n_Ta27_asdl_seq_new(Py_ssize_t size, PyArena *arena)\n{\n    a"
  },
  {
    "path": "ast27/Python/ast.c",
    "chars": 118111,
    "preview": "/*\n * This file includes functions to transform a concrete syntax tree (CST) to\n * an abstract syntax tree (AST).  The m"
  },
  {
    "path": "ast27/Python/graminit.c",
    "chars": 45695,
    "preview": "/* Generated by Parser/pgen */\n\n#include \"../Include/pgenheaders.h\"\n#include \"../Include/grammar.h\"\ngrammar _Ta27Parser_"
  },
  {
    "path": "ast27/Python/mystrtoul.c",
    "chars": 9483,
    "preview": "\n#include \"Python.h\"\n\n#if defined(__sgi) && defined(WITH_THREAD) && !defined(_SGI_MP_SOURCE)\n#define _SGI_MP_SOURCE\n#end"
  },
  {
    "path": "ast3/Custom/typed_ast.c",
    "chars": 10221,
    "preview": "#include \"Python.h\"\n#include \"../Include/Python-ast.h\"\n#include \"../Include/node.h\"\n#include \"../Include/grammar.h\"\n#inc"
  },
  {
    "path": "ast3/Grammar/Grammar",
    "chars": 7328,
    "preview": "# Grammar for Python\n\n# NOTE WELL: You should also follow all the steps listed at\n# https://devguide.python.org/grammar/"
  },
  {
    "path": "ast3/Include/Python-ast.h",
    "chars": 23443,
    "preview": "/* File automatically generated by Parser/asdl_c.py. */\n\n#include \"../Include/asdl.h\"\n\ntypedef struct _mod *mod_ty;\n\ntyp"
  },
  {
    "path": "ast3/Include/asdl.h",
    "chars": 1251,
    "preview": "#ifndef Ta3_ASDL_H\n#define Ta3_ASDL_H\n\n#include \"../Include/pyarena.h\"\n\ntypedef PyObject * identifier;\ntypedef PyObject "
  },
  {
    "path": "ast3/Include/ast.h",
    "chars": 677,
    "preview": "#ifndef Ta3_AST_H\n#define Ta3_AST_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\nextern int Ta3AST_Validate(mod_ty);\nextern m"
  },
  {
    "path": "ast3/Include/bitset.h",
    "chars": 813,
    "preview": "\n#ifndef Ta3_BITSET_H\n#define Ta3_BITSET_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n/* Bitset interface */\n\n#define BYTE "
  },
  {
    "path": "ast3/Include/errcode.h",
    "chars": 1698,
    "preview": "#ifndef Ta3_ERRCODE_H\n#define Ta3_ERRCODE_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n\n/* Error codes passed around betwee"
  },
  {
    "path": "ast3/Include/graminit.h",
    "chars": 2063,
    "preview": "/* Generated by Parser/pgen */\n\n#define single_input 256\n#define file_input 257\n#define eval_input 258\n#define decorator"
  },
  {
    "path": "ast3/Include/grammar.h",
    "chars": 2323,
    "preview": "\n/* Grammar interface */\n\n#ifndef Ta3_GRAMMAR_H\n#define Ta3_GRAMMAR_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n#include \""
  },
  {
    "path": "ast3/Include/node.h",
    "chars": 1095,
    "preview": "\n/* Parse tree node interface */\n\n#ifndef Ta3_NODE_H\n#define Ta3_NODE_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\ntypedef "
  },
  {
    "path": "ast3/Include/parsetok.h",
    "chars": 2899,
    "preview": "\n/* Parser-tokenizer link interface */\n#ifndef Py_LIMITED_API\n#ifndef Ta3_PARSETOK_H\n#define Ta3_PARSETOK_H\n#ifdef __cpl"
  },
  {
    "path": "ast3/Include/pgenheaders.h",
    "chars": 348,
    "preview": "#ifndef DUMMY_Py_PGENHEADERS_H\n#define DUMMY_Py_PGENHEADERS_H\n\n/* pgenheaders.h is included by a bunch of files but noth"
  },
  {
    "path": "ast3/Include/pyarena.h",
    "chars": 361,
    "preview": "/* An arena-like memory interface for the compiler.\n */\n\n#ifndef Ta3_PYARENA_H\n#define Ta3_PYARENA_H\n\n#if PY_MINOR_VERSI"
  },
  {
    "path": "ast3/Include/pycore_pyarena.h",
    "chars": 2653,
    "preview": "/* An arena-like memory interface for the compiler.\n */\n\n#ifndef Ta3_INTERNAL_PYARENA_H\n#define Ta3_INTERNAL_PYARENA_H\n#"
  },
  {
    "path": "ast3/Include/token.h",
    "chars": 2569,
    "preview": "\n/* Token types */\n#ifndef Py_LIMITED_API\n#ifndef Ta3_TOKEN_H\n#define Ta3_TOKEN_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif"
  },
  {
    "path": "ast3/Parser/Python.asdl",
    "chars": 5466,
    "preview": "-- ASDL's 7 builtin types are:\n-- identifier, int, string, bytes, object, singleton, constant\n--\n-- singleton: None, Tru"
  },
  {
    "path": "ast3/Parser/acceler.c",
    "chars": 3401,
    "preview": "\n/* Parser accelerator module */\n\n/* The parser as originally conceived had disappointing performance.\n   This module do"
  },
  {
    "path": "ast3/Parser/asdl.py",
    "chars": 12881,
    "preview": "#-------------------------------------------------------------------------------\n# Parser for ASDL [1] definition files."
  },
  {
    "path": "ast3/Parser/asdl_c.py",
    "chars": 45012,
    "preview": "#! /usr/bin/env python\n\"\"\"Generate C code from an ASDL description.\"\"\"\n\nimport os, sys\n\nimport asdl\n\nTABSIZE = 4\nMAX_COL"
  },
  {
    "path": "ast3/Parser/bitset.c",
    "chars": 1087,
    "preview": "\n/* Bitset primitives used by the parser generator */\n\n#include \"../Include/pgenheaders.h\"\n#include \"../Include/bitset.h"
  },
  {
    "path": "ast3/Parser/grammar.c",
    "chars": 7665,
    "preview": "\n/* Grammar implementation */\n\n#include \"Python.h\"\n#include \"../Include/pgenheaders.h\"\n\n#include <ctype.h>\n\n#include \".."
  },
  {
    "path": "ast3/Parser/grammar1.c",
    "chars": 1310,
    "preview": "\n/* Grammar subroutines needed by parser */\n\n#include \"Python.h\"\n#include \"../Include/pgenheaders.h\"\n#include \"../Includ"
  },
  {
    "path": "ast3/Parser/node.c",
    "chars": 4566,
    "preview": "/* Parse tree node implementation */\n\n#include \"Python.h\"\n#include \"../Include/node.h\"\n#include \"../Include/errcode.h\"\n\n"
  },
  {
    "path": "ast3/Parser/parser.c",
    "chars": 11577,
    "preview": "\n/* Parser implementation */\n\n/* For a description, see the comments at end of this file */\n\n/* XXX To do: error recover"
  },
  {
    "path": "ast3/Parser/parser.h",
    "chars": 1219,
    "preview": "#ifndef Ta3_PARSER_H\n#define Ta3_PARSER_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n\n/* Parser interface */\n\n#define MAXST"
  },
  {
    "path": "ast3/Parser/parsetok.c",
    "chars": 13272,
    "preview": "\n/* Parser-tokenizer link implementation */\n\n#include \"../Include/pgenheaders.h\"\n#include \"tokenizer.h\"\n#include \"../Inc"
  },
  {
    "path": "ast3/Parser/tokenizer.c",
    "chars": 58106,
    "preview": "\n/* Tokenizer implementation */\n\n#include \"Python.h\"\n#include \"../Include/pgenheaders.h\"\n\n#include <ctype.h>\n#include <a"
  },
  {
    "path": "ast3/Parser/tokenizer.h",
    "chars": 3592,
    "preview": "#ifndef Ta3_TOKENIZER_H\n#define Ta3_TOKENIZER_H\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n#include \"object.h\"\n\n/* Tokenize"
  },
  {
    "path": "ast3/Python/Python-ast.c",
    "chars": 282113,
    "preview": "/* File automatically generated by Parser/asdl_c.py. */\n\n#include <stddef.h>\n\n#include \"Python.h\"\n#include \"../Include/P"
  },
  {
    "path": "ast3/Python/asdl.c",
    "chars": 1435,
    "preview": "#include \"Python.h\"\n#include \"../Include/asdl.h\"\n\nasdl_seq *\n_Ta3_asdl_seq_new(Py_ssize_t size, PyArena *arena)\n{\n    as"
  },
  {
    "path": "ast3/Python/ast.c",
    "chars": 175783,
    "preview": "/*\n * This file includes functions to transform a concrete syntax tree (CST) to\n * an abstract syntax tree (AST). The ma"
  },
  {
    "path": "ast3/Python/graminit.c",
    "chars": 48037,
    "preview": "/* Generated by Parser/pgen */\n\n#include \"../Include/pgenheaders.h\"\n#include \"../Include/grammar.h\"\nextern grammar _Ta3P"
  },
  {
    "path": "ast3/tests/test_basics.py",
    "chars": 7514,
    "preview": "import os\n\nimport pytest\n\nfrom typed_ast import _ast3\nfrom typed_ast import _ast27\nimport typed_ast.conversions\n\n# Lowes"
  },
  {
    "path": "release_process.md",
    "chars": 1004,
    "preview": "# Typed AST PyPI Release Process\n0. Thoroughly test the prospective release.\n1. Make a commit titled \"Release version \\["
  },
  {
    "path": "setup.py",
    "chars": 4686,
    "preview": "import ast\nimport re\nimport sys\nif sys.version_info[0] < 3 or sys.version_info[1] < 3:\n    sys.exit('Error: typed_ast on"
  },
  {
    "path": "tools/Grammar.patch",
    "chars": 4114,
    "preview": "diff --git a/ast3/Grammar/Grammar b/ast3/Grammar/Grammar\nindex b139e9f..dfd730f 100644\n--- a/ast3/Grammar/Grammar\n+++ b/"
  },
  {
    "path": "tools/Python-asdl.patch",
    "chars": 2951,
    "preview": "diff --git a/ast3/Parser/Python.asdl b/ast3/Parser/Python.asdl\nindex f470ad1..7bde99c 100644\n--- a/ast3/Parser/Python.as"
  },
  {
    "path": "tools/asdl_c.patch",
    "chars": 4447,
    "preview": "--- /Users/guido/src/cpython37/Parser/asdl_c.py\t2018-09-10 08:18:23.000000000 -0700\n+++ ast3/Parser/asdl_c.py\t2019-01-15"
  },
  {
    "path": "tools/ast.patch",
    "chars": 16587,
    "preview": "diff --git a/ast3/Python/ast.c b/ast3/Python/ast.c\nindex e12f8e6..1fa762d 100644\n--- a/ast3/Python/ast.c\n+++ b/ast3/Pyth"
  },
  {
    "path": "tools/find_exported_symbols",
    "chars": 355,
    "preview": "#!/bin/bash\nPROJ_DIR=\"$( cd \"$( dirname \"${BASH_SOURCE[0]}\" )\" && pwd )/..\"\n\n# This requires GNU binutils (e.g. brew ins"
  },
  {
    "path": "tools/parsetok.patch",
    "chars": 2905,
    "preview": "diff --git a/ast3/Parser/parsetok.c b/ast3/Parser/parsetok.c\nindex 9f01a0d..5529feb 100644\n--- a/ast3/Parser/parsetok.c\n"
  },
  {
    "path": "tools/script",
    "chars": 1369,
    "preview": "#!/bin/bash -ex\n\n# Automate steps 1-4 of update_process.md (Mac).\n\nHERE=$(dirname ${BASH_SOURCE[0]})\ncd $HERE/..\npwd\n\nCP"
  },
  {
    "path": "tools/token.patch",
    "chars": 539,
    "preview": "diff --git a/ast3/Include/token.h b/ast3/Include/token.h\nindex a657fdd..d0b2b94 100644\n--- a/ast3/Include/token.h\n+++ b/"
  },
  {
    "path": "tools/tokenizer.patch",
    "chars": 2429,
    "preview": "diff --git a/ast3/Parser/tokenizer.c b/ast3/Parser/tokenizer.c\nindex 617a744..667fb4a 100644\n--- a/ast3/Parser/tokenizer"
  },
  {
    "path": "tools/update_ast27_asdl",
    "chars": 273,
    "preview": "#!/bin/bash -eux\n\n# Run after changing `Parser/Python.asdl`\n\nPROJ_DIR=\"$( cd \"$( dirname \"${BASH_SOURCE[0]}\" )\" && pwd )"
  },
  {
    "path": "tools/update_ast3_asdl",
    "chars": 291,
    "preview": "#!/bin/bash -eux\n\n# Run after changing `Parser/Python.asdl`\n\nPROJ_DIR=\"$( cd \"$( dirname \"${BASH_SOURCE[0]}\" )\" && pwd )"
  },
  {
    "path": "tools/update_ast3_grammar",
    "chars": 335,
    "preview": "#!/bin/bash -eux\n\n# Run after changing `Grammar/Grammar`\n\nPROJ_DIR=\"$( cd \"$( dirname \"${BASH_SOURCE[0]}\" )\" && pwd )/.."
  },
  {
    "path": "tools/update_exported_symbols",
    "chars": 451,
    "preview": "#!/bin/bash\nPROJ_DIR=\"$( cd \"$( dirname \"${BASH_SOURCE[0]}\" )\" && pwd )/..\"\n\nfor CHANGE in $( cat \"$PROJ_DIR/exported_sy"
  },
  {
    "path": "tools/update_header_guards",
    "chars": 346,
    "preview": "#!/bin/bash -eux\n\n# usage: ./update_header_guards VERSION_NUMBER\n\nPROJ_DIR=\"$( cd \"$( dirname \"${BASH_SOURCE[0]}\" )\" && "
  },
  {
    "path": "typed_ast/__init__.py",
    "chars": 22,
    "preview": "__version__ = \"1.5.5\"\n"
  },
  {
    "path": "typed_ast/ast27.py",
    "chars": 12630,
    "preview": "# -*- coding: utf-8 -*-\n\"\"\"\n    ast27\n    ~~~\n\n    The `ast27` module helps Python applications to process trees of the "
  },
  {
    "path": "typed_ast/ast3.py",
    "chars": 13761,
    "preview": "\"\"\"\n    typed_ast.ast3\n    ~~~\n\n    The `ast3` module helps Python applications to process trees of the Python\n    abstr"
  },
  {
    "path": "typed_ast/conversions.py",
    "chars": 8632,
    "preview": "from typed_ast import ast27\nfrom typed_ast import ast3\n\ndef py2to3(ast):\n    \"\"\"Converts a typed Python 2.7 ast to a typ"
  }
]

About this extraction

This page contains the full source code of the python/typed_ast GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction covers 95 files (1.5 MB, approximately 409.2k tokens) and includes a symbol index of 1,243 extracted functions, classes, methods, constants, and types. It can be used with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.
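The file index above is a JSON array of objects, each with `path`, `chars`, and `preview` fields. A minimal sketch of how a downstream tool might consume such a manifest (the two sample entries are copied from the index; a real manifest would contain all 95 files):

```python
import json

# Two sample entries copied from the index above; field names match
# the GitExtract manifest format (path, chars, preview).
manifest = json.loads("""
[
  {"path": "typed_ast/__init__.py", "chars": 22,
   "preview": "__version__ = \\"1.5.5\\"\\n"},
  {"path": "typed_ast/conversions.py", "chars": 8632,
   "preview": "from typed_ast import ast27\\n..."}
]
""")

# Aggregate statistics, as an index consumer might compute them.
total_chars = sum(entry["chars"] for entry in manifest)
largest = max(manifest, key=lambda e: e["chars"])
print(total_chars)       # 8654
print(largest["path"])   # typed_ast/conversions.py
```

Because `chars` counts characters rather than bytes, totals computed this way will differ slightly from on-disk file sizes for files containing multi-byte characters.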

Extracted by GitExtract, a free GitHub-repo-to-text converter for AI, built by Nikandr Surkov.
