Full Code of davidjamesca/ctypesgen for AI

Repository: davidjamesca/ctypesgen
Branch: master
Commit: a90952dd6056
Files: 70
Total size: 711.8 KB

Directory structure:
gitextract_dfid13aw/

├── .flake8
├── .github/
│   └── workflows/
│       ├── black.yml
│       ├── flake8.yml
│       ├── publish.yml
│       └── test.yml
├── .gitignore
├── .travis.yml
├── CHANGELOG.md
├── CONTRIBUTING
├── LICENSE
├── MANIFEST.in
├── README.md
├── ctypesgen/
│   ├── .gitignore
│   ├── __init__.py
│   ├── __main__.py
│   ├── ctypedescs.py
│   ├── descriptions.py
│   ├── expressions.py
│   ├── libraryloader.py
│   ├── messages.py
│   ├── options.py
│   ├── parser/
│   │   ├── .gitignore
│   │   ├── __init__.py
│   │   ├── cdeclarations.py
│   │   ├── cgrammar.py
│   │   ├── cparser.py
│   │   ├── ctypesparser.py
│   │   ├── datacollectingparser.py
│   │   ├── lex.py
│   │   ├── lextab.py
│   │   ├── parsetab.py
│   │   ├── pplexer.py
│   │   ├── preprocessor.py
│   │   └── yacc.py
│   ├── printer_json/
│   │   ├── __init__.py
│   │   └── printer.py
│   ├── printer_python/
│   │   ├── __init__.py
│   │   ├── defaultheader.py
│   │   ├── preamble.py
│   │   └── printer.py
│   ├── processor/
│   │   ├── __init__.py
│   │   ├── dependencies.py
│   │   ├── operations.py
│   │   └── pipeline.py
│   └── version.py
├── debian/
│   ├── .gitignore
│   ├── compat
│   ├── control
│   ├── ctypesgen.docs
│   ├── ctypesgen.manpages
│   ├── mk_changelog
│   ├── mk_manpage
│   └── rules
├── demo/
│   ├── .gitignore
│   ├── README.md
│   ├── demoapp.c
│   ├── demoapp.py
│   ├── demolib.c
│   ├── demolib.h
│   └── pydemolib.py
├── docs/
│   └── publishing.md
├── pyproject.toml
├── run.py
├── setup.py
├── tests/
│   ├── .gitignore
│   ├── __init__.py
│   ├── ctypesgentest.py
│   └── testsuite.py
├── todo.txt
└── tox.ini

================================================
FILE CONTENTS
================================================

================================================
FILE: .flake8
================================================
[flake8]
ignore =
    # whitespace before ':' (Black)
    E203,
    # line break before binary operator (Black)
    W503,

per-file-ignores =
    # Files and directories which need fixes or specific exceptions.
    #
    # Description of codes:
    # F401    imported but unused
    # E501    line too long
    #
    ctypesgen/__init__.py: F401
    ctypesgen/parser/cgrammar.py: E501

max-line-length = 100

exclude =
    ctypesgen/parser/parsetab.py,
    ctypesgen/parser/lextab.py,
    ctypesgen/parser/yacc.py,
    ctypesgen/parser/lex.py,
    demo/pydemolib.py,
    .git,
    __pycache__,
    debian

builtins =
    _,


================================================
FILE: .github/workflows/black.yml
================================================
---
name: Python Black Formatting

on:
  - push
  - pull_request
  - fork

jobs:
  black:
    name: Black
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'

      - name: Install
        run: |
          python -m pip install --upgrade pip
          pip install black==23.3.0

      - name: Run Black
        run: |
          black --check --diff setup.py run.py ctypesgen/ \
            --exclude='${{ env.EXCLUDE }}'
        env:
          EXCLUDE: ".*tab.py|ctypesgen/parser/cgrammar.py|\
            ctypesgen/parser/lex.py|ctypesgen/parser/yacc.py"


================================================
FILE: .github/workflows/flake8.yml
================================================
---
name: Python Flake8 Code Quality

on:
  - push
  - pull_request
  - fork

jobs:
  flake8:
    name: ${{ matrix.directory }}
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: 3.8

      - name: Install
        run: |
          python -m pip install --upgrade pip
          pip install flake8==3.8.4

      - name: Run Flake8
        run: |
          flake8 --count --statistics --show-source --jobs=$(nproc) .


================================================
FILE: .github/workflows/publish.yml
================================================
---
name: Publish Python distributions to PyPI

on:
  release:
    types: [published]
jobs:
  build-n-publish:
    name: Build and publish Python distributions to PyPI
    runs-on: ubuntu-latest
    steps:
      - name: Check out repository
        uses: actions/checkout@v3
        with:
          ref: ${{ github.ref }}

      - name: Set up Python 3.10
        uses: actions/setup-python@v3
        with:
          python-version: '3.10'

      - name: Install pypa/build
        run: python -m pip install build --user

      - name: Build a binary wheel and a source tarball
        run: python -m build

      - name: Publish distribution to GitHub
        uses: softprops/action-gh-release@v1
        with:
          files: |
              dist/*

      - name: Publish distribution to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.PYPI_API_TOKEN }}


================================================
FILE: .github/workflows/test.yml
================================================
---
name: Test

on:
  - push
  - pull_request
  - fork

jobs:
  setup-and-test:
    name: Python-${{ matrix.python }} ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        include:
          # Linux
          - os: ubuntu-latest
            python: 3.7
          - os: ubuntu-latest
            python: 3.8
          - os: ubuntu-latest
            python: 3.9
          - os: ubuntu-latest
            python: '3.10'
          # macOS
          - os: macos-latest
            python: '3.10'
          # Windows
          - os: windows-latest
            python: '3.10'
      fail-fast: false

    steps:
      - uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python }}

      - name: Install Python dependencies
        run: |
          python -m pip install --upgrade pip
          pip install pytest

      - name: Run Test
        run: |
          pytest -v --showlocals tests/testsuite.py


================================================
FILE: .gitignore
================================================
# precompiled python files
*.pyc

# generated by distutils
MANIFEST
dist/

# generated by setuptools
build/
ctypesgen.egg-info/
.eggs/
.python-version
tests/.python-version

# Swap/backup editor files
*.swp
*~
.pybuild/
.tox/
.idea/
core


================================================
FILE: .travis.yml
================================================
dist: bionic
language: python
python: 3.7.8

install:
  - pip install tox
script:
  - tox

stages:
  - name: tox
  - name: publish to test.pypi.org
    if: env(publish) = true AND type = api
    # to publish, go to https://travis-ci.com/Alan-R/ctypesgen/
    # click more options -> trigger build
    # in textbox enter this:
    # env:
    #   publish: true
  - name: verify
    if: env(publish) = true AND type = api

jobs:
  include:
    - stage: tox
      env: TOXENV=py37

    - stage: tox
      env: TOXENV=black

    - stage: publish to test.pypi.org
      install:
        - pip install --upgrade build twine
      script:
        - python -m build
        - python -m twine upload --repository-url https://test.pypi.org/legacy/ dist/*

    - stage: verify
      install:
        - pip install --index-url https://test.pypi.org/simple/ --no-deps ctypesgen
      script:
        - python -c 'import ctypesgen; print(ctypesgen.VERSION)'


================================================
FILE: CHANGELOG.md
================================================
## Change Log

### Unreleased

### v1.1.1

- Fixed inconsistency in version output in released packages

### v1.1.0

This release has a number of bug fixes in addition to a few new features.
Following a complete transition to Python 3, with dropped Python 2 support,
major work was made towards code modernization and quality.

- The code is now Black formatted and Flake8 tested
- Greatly improved unittest framework
- Embedded PLY version updated to 3.11
- New option: `--no-embed-preamble` create separate files for preamble and
  loader instead of embedding in each output file
- New option: `--allow-gnu-c` do not undefine `__GNUC__`
- Fixed library loader search path on macOS
- Fixed rare bug, processing (legacy) header files with MacRoman encoding
  on macOS
- Added full support for floating and integer constants
- Added support for sized integer types on Windows
- Added support to handle `restrict` and `_Noreturn` keywords
- Added name formats to posix library loader
- Fixed mapping of 'short int' to c_short
- Git tags are now using `x.y.z` format

### v1.0.2

Many issues fixed. Improved parsing of gcc attributes.

Implements automatic calling convention selection based on gcc attributes for
stdcall/cdecl.

- Simplify and unify library loader for various platforms. Improve library path
  searches on Linux (parsed ld.so.conf includes now).
- First implementation of #pragma pack
- First implementation of #undef
- Adds several command line options:
  `-D` `--define`
  `-U` `--undefine`
  `--no-undefs`
  `-P` `--strip-prefix`
  `--debug-level`

### v1.0.1

Fix handling of function prototypes 

Other minor improvements included.

### v1.0.0

Py2/Py3 support 

Various development branches merged back

In addition to the various developments from the different branches, this
tag also represents a code state that:

- ties in with Travis CI to watch code developments
- improves testsuite, including moving all JSON tests to testsuite
- includes a decent Debian package build configuration
- automatically creates a man page to be included in the Debian package


================================================
FILE: CONTRIBUTING
================================================
The best way to document a bug is to create a new test which demonstrates it. You should do that by adding a new test to:
    tests/testsuite.py
This is *required* for any patches you might provide. You must provide tests to demonstrate your bug fix or enhancement.

All patches have to pass the unit tests. You can run the tests by running "tox" with no options.

All our Python code is formatted using the "black" command with a 100-character line length.
    black --line-length 100
You can verify your patch formatting before you submit it by running "tox -e black".


================================================
FILE: LICENSE
================================================
Copyright (c) 2007-2022, Ctypesgen Developers
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice,
   this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.


================================================
FILE: MANIFEST.in
================================================
graft ctypesgen
recursive-exclude ctypesgen .gitignore
global-exclude *.py[cod]
include ctypesgen/VERSION


================================================
FILE: README.md
================================================
                              ctypesgen
                              ---------

                  (c) Ctypesgen developers 2007-2022
                 https://github.com/ctypesgen/ctypesgen

_ctypesgen_ is a pure-python ctypes wrapper generator. It parses C header files
and creates a wrapper for libraries based on what it finds.

Preprocessor macros are handled in a manner consistent with typical C code.
Preprocessor macro functions are translated into Python functions that are then
made available to the user of the newly-generated Python wrapper library.

It can also output JSON for use with Mork, a tool that generates Lua bindings
via the alien module (a binding of libffi to Lua).
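The generated wrapper is ordinary `ctypes` code. As a rough illustration (hand-written here, not actual ctypesgen output), a header declaring `struct point { int x; int y; };` and an `int[4]` array would map to constructs like:

```python
import ctypes

# Hand-written sketch of the kind of code ctypesgen emits; the struct and
# field names here are hypothetical, taken from an imagined header.
class point(ctypes.Structure):
    _fields_ = [("x", ctypes.c_int), ("y", ctypes.c_int)]

# A C array type such as "int[4]" becomes a ctypes array type.
IntQuad = ctypes.c_int * 4

p = point(x=3, y=4)
quad = IntQuad(1, 2, 3, 4)
print(p.x, p.y, list(quad))
```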

## Documentation

See https://github.com/ctypesgen/ctypesgen/wiki for full documentation.

Run `ctypesgen --help` for full range of available options.

## Installation

_ctypesgen_ can be installed by `pip install ctypesgen`. It requires Python 3.7
or later.

## Basic Usage

This project automatically generates ctypes wrappers for header files written
in C.

For example, if you'd like to generate Neon bindings, you can do so using this
recipe (using a standard pip install):

```sh
ctypesgen -lneon /usr/local/include/neon/ne_*.h -o neon.py
```

Some libraries, such as APR, need special flags to compile. You can pass these
flags in on the command line.

For example:

```sh
FLAGS=`apr-1-config --cppflags --includes`
ctypesgen $FLAGS -llibapr-1.so $HOME/include/apr-1/apr*.h -o apr.py
```

Sometimes, libraries will depend on each other. You can specify these
dependencies using -mmodule, where module is the name of the dependency module.

Here's an example for apr_util:

```sh
ctypesgen $FLAGS -llibaprutil-1.so $HOME/include/apr-1/ap[ru]*.h \
	-mapr -o apr_util.py
```
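At run time, a generated module locates the shared library and binds each function with its argument and return types. Conceptually (a simplified sketch, not the real loader in `ctypesgen/libraryloader.py`, which searches many more paths):

```python
import ctypes

# Simplified sketch of what a generated wrapper does on import: load the
# library, then annotate each function with argtypes/restype. To keep the
# example self-contained we bind abs() from the C library already mapped
# into this process (CDLL(None) works on POSIX, not on Windows).
libc = ctypes.CDLL(None)
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int
print(libc.abs(-7))  # → 7
```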

If you want JSON output (e.g. for generating Lua bindings), use

```
--output-language=json
```

When outputting JSON, you will probably also want to use

```
--all-headers --builtin-symbols --no-stddef-types --no-gnu-types
--no-python-types
```
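The JSON output is a machine-readable list of the parsed descriptions, so any JSON-aware tool can post-process it. A minimal sketch using the stdlib `json` module (the item shape below is illustrative; inspect actual ctypesgen output for the exact schema):

```python
import json

# Illustrative input only: real ctypesgen JSON output should be inspected
# for the exact field names; the "type" and "name" keys are assumptions.
sample = '[{"type": "function", "name": "ne_session_create"}]'
functions = [d["name"] for d in json.loads(sample) if d.get("type") == "function"]
print(functions)
```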

## Related Software of Interest

_libffi_ is a portable Foreign Function Interface library:
http://sources.redhat.com/libffi/

_Mork_, the friendly alien, can be found at:
https://github.com/rrthomas/mork

## License

_ctypesgen_ is distributed under the New (2-clause) BSD License:
http://www.opensource.org/licenses/bsd-license.php


================================================
FILE: ctypesgen/.gitignore
================================================
VERSION


================================================
FILE: ctypesgen/__init__.py
================================================
"""
The ctypesgen package contains the main body of ctypesgen - in fact, it
contains everything but the command-line interface.

ctypesgen's job is divided into three steps:

Step 1: Parse

Ctypesgen reads the input header files and parses them. It generates a list of
function, variable, struct, union, enum, constant, typedef, and macro
descriptions from the input files. These descriptions are encapsulated as
ctypesgen.descriptions.Description objects.

The package ctypesgen.parser is responsible for the parsing stage.

Step 2: Process

Ctypesgen processes the list of descriptions from the parsing stage. This is
the stage where ctypesgen resolves name conflicts and filters descriptions using
the regexes specified on the command line. Other processing steps take place
at this stage, too. When processing is done, ctypesgen finalizes which
descriptions will be included in the output file.

The package ctypesgen.processor is responsible for the processing stage.

Step 3: Print

Ctypesgen writes the descriptions to the output file, along with a header.

The package ctypesgen.printer_python (aliased as ctypesgen.printer) is
responsible for the printing stage.

There are three modules in ctypesgen that describe the format that the
parser, processor, and printer modules use to pass information. They are:

* descriptions: Classes to represent the descriptions.

* ctypedescs: Classes to represent C types.

* expressions: Classes to represent an expression in a language-independent
format.
"""

__all__ = [
    "parser",
    "processor",
    "printer",
    "descriptions",
    "ctypedescs",
    "expressions",
    "messages",
    "options",
    "version",
]

# Workhorse modules
from . import parser
from . import processor
from . import printer_python
from . import version

# Modules describing internal format
from . import descriptions
from . import ctypedescs
from . import expressions

# Helper modules
from . import messages
from . import options

try:
    from . import printer_json
except ImportError:
    pass

__version__ = version.VERSION.partition("-")[-1]
VERSION = __version__

printer = printer_python  # Default the printer to generating Python


================================================
FILE: ctypesgen/__main__.py
================================================
"""
Command-line interface for ctypesgen
"""

import argparse

from ctypesgen import (
    messages as msgs,
    options as core_options,
    parser as core_parser,
    printer_python,
    printer_json,
    processor,
    version,
)


def find_names_in_modules(modules):
    names = set()
    for module in modules:
        try:
            mod = __import__(module)
        except Exception:
            pass
        else:
            names.update(dir(mod))
    return names


def main(givenargs=None):
    # TODO(geisserml) In the future, convert action="append" to nargs="*" - that's nicer to use

    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--version",
        action="version",
        version=version.VERSION_NUMBER,
    )

    # Parameters
    parser.add_argument("headers", nargs="+", help="Sequence of header files")
    parser.add_argument(
        "-o",
        "--output",
        metavar="FILE",
        help="write wrapper to FILE [default stdout]",
    )
    parser.add_argument(
        "-l",
        "--library",
        dest="libraries",
        action="append",
        default=[],
        metavar="LIBRARY",
        help="link to LIBRARY",
    )
    parser.add_argument(
        "--include",
        dest="other_headers",
        action="append",
        default=[],
        metavar="HEADER",
        help="include system header HEADER (e.g. stdio.h or stdlib.h)",
    )
    parser.add_argument(
        "-m",
        "--module",
        "--link-module",
        action="append",
        dest="modules",
        metavar="MODULE",
        default=[],
        help="use symbols from Python module MODULE",
    )
    parser.add_argument(
        "-I",
        "--includedir",
        action="append",
        dest="include_search_paths",
        default=[],
        metavar="INCLUDEDIR",
        help="add INCLUDEDIR as a directory to search for headers",
    )
    parser.add_argument(
        "-L",
        "-R",
        "--rpath",
        "--libdir",
        action="append",
        dest="universal_libdirs",
        default=[],
        metavar="LIBDIR",
        help="Add LIBDIR to the search path (both compile-time and run-time)",
    )
    parser.add_argument(
        "--compile-libdir",
        action="append",
        dest="compile_libdirs",
        metavar="LIBDIR",
        default=[],
        help="Add LIBDIR to the compile-time library search path.",
    )
    parser.add_argument(
        "--runtime-libdir",
        action="append",
        dest="runtime_libdirs",
        metavar="LIBDIR",
        default=[],
        help="Add LIBDIR to the run-time library search path.",
    )
    parser.add_argument(
        "--no-embed-preamble",
        action="store_false",
        dest="embed_preamble",
        default=True,
        help="Do not embed preamble and loader in output file. "
        "Defining --output as a file and --output-language to "
        "Python is a prerequisite.",
    )

    # Parser options
    parser.add_argument(
        "--cpp",
        dest="cpp",
        default="gcc -E",
        help="The command to invoke the c preprocessor, including any "
        "necessary options (default: gcc -E)",
    )
    parser.add_argument(
        "--allow-gnu-c",
        action="store_true",
        dest="allow_gnu_c",
        default=False,
        help="Specify whether to undefine the '__GNUC__' macro, "
        "while invoking the C preprocessor.\n"
        "(default: False. i.e. ctypesgen adds an implicit undefine using '-U __GNUC__'.)\n"
        "Specify this flag to avoid ctypesgen undefining '__GNUC__' as shown above.",
    )
    parser.add_argument(
        "-D",
        "--define",
        action="append",
        dest="cpp_defines",
        metavar="MACRO",
        default=[],
        help="Add a definition to the preprocessor via commandline",
    )
    parser.add_argument(
        "-U",
        "--undefine",
        action="append",
        dest="cpp_undefines",
        metavar="NAME",
        default=[],
        help="Instruct the preprocessor to undefine the specified macro via commandline",
    )
    parser.add_argument(
        "--save-preprocessed-headers",
        metavar="FILENAME",
        dest="save_preprocessed_headers",
        default=None,
        help="Save the preprocessed headers to the specified FILENAME",
    )
    parser.add_argument(
        "--optimize-lexer",
        dest="optimize_lexer",
        action="store_true",
        default=False,
        help="Run the lexer in optimized mode.  This mode requires write "
        "access to lextab.py file stored within the ctypesgen package.",
    )

    # Processor options
    parser.add_argument(
        "-a",
        "--all-headers",
        action="store_true",
        dest="all_headers",
        default=False,
        help="include symbols from all headers, including system headers",
    )
    parser.add_argument(
        "--builtin-symbols",
        action="store_true",
        dest="builtin_symbols",
        default=False,
        help="include symbols automatically generated by the preprocessor",
    )
    parser.add_argument(
        "--no-macros",
        action="store_false",
        dest="include_macros",
        default=True,
        help="Don't output macros.",
    )
    parser.add_argument(
        "--no-undefs",
        action="store_false",
        dest="include_undefs",
        default=True,
        help="Do not remove macro definitions as per #undef directives",
    )
    parser.add_argument(
        "-i",
        "--include-symbols",
        action="append",
        dest="include_symbols",
        metavar="REGEXPR",
        default=[],
        help="Regular expression for symbols to always include.  Multiple "
        "instances of this option will be combined into a single expression "
        "doing something like '(expr1|expr2|expr3)'.",
    )
    parser.add_argument(
        "-x",
        "--exclude-symbols",
        action="append",
        dest="exclude_symbols",
        metavar="REGEXPR",
        default=[],
        help="Regular expression for symbols to exclude.  Multiple instances "
        "of this option will be combined into a single expression doing "
        "something like '(expr1|expr2|expr3)'.",
    )
    parser.add_argument(
        "--no-stddef-types",
        action="store_true",
        dest="no_stddef_types",
        default=False,
        help="Do not support extra C types from stddef.h",
    )
    parser.add_argument(
        "--no-gnu-types",
        action="store_true",
        dest="no_gnu_types",
        default=False,
        help="Do not support extra GNU C types",
    )
    parser.add_argument(
        "--no-python-types",
        action="store_true",
        dest="no_python_types",
        default=False,
        help="Do not support extra C types built in to Python",
    )
    parser.add_argument(
        "--no-load-library",
        action="store_true",
        dest="no_load_library",
        default=False,
        help="Do not try to load library during the processing",
    )

    # Printer options
    parser.add_argument(
        "--header-template",
        dest="header_template",
        default=None,
        metavar="TEMPLATE",
        help="Use TEMPLATE as the header template in the output file.",
    )
    parser.add_argument(
        "--strip-build-path",
        dest="strip_build_path",
        default=None,
        metavar="BUILD_PATH",
        help="Strip build path from header paths in the wrapper file.",
    )
    parser.add_argument(
        "--insert-file",
        dest="inserted_files",
        default=[],
        action="append",
        metavar="FILENAME",
        help="Add the contents of FILENAME to the end of the wrapper file.",
    )
    parser.add_argument(
        "--output-language",
        dest="output_language",
        metavar="LANGUAGE",
        default="py",
        choices=("py", "json"),
        help="Choose output language",
    )
    parser.add_argument(
        "-P",
        "--strip-prefix",
        dest="strip_prefixes",
        default=[],
        action="append",
        metavar="REGEXPR",
        help="Regular expression to match prefix to strip from all symbols.  "
        "Multiple instances of this option will be combined into a single "
        "expression doing something like '(expr1|expr2|expr3)'.",
    )

    # Error options
    parser.add_argument(
        "--all-errors",
        action="store_true",
        default=False,
        dest="show_all_errors",
        help="Display all warnings and errors even if they would not affect output.",
    )
    parser.add_argument(
        "--show-long-errors",
        action="store_true",
        default=False,
        dest="show_long_errors",
        help="Display long error messages instead of abbreviating error messages.",
    )
    parser.add_argument(
        "--no-macro-warnings",
        action="store_false",
        default=True,
        dest="show_macro_warnings",
        help="Do not print macro warnings.",
    )
    parser.add_argument(
        "--debug-level",
        dest="debug_level",
        default=0,
        type=int,
        help="Run ctypesgen with specified debug level (also applies to yacc parser)",
    )

    parser.set_defaults(**core_options.default_values)
    args = parser.parse_args(givenargs)

    # Important: don't use +=, it modifies the original list instead of
    # creating a new one. This can be problematic with repeated API calls.
    args.compile_libdirs = args.compile_libdirs + args.universal_libdirs
    args.runtime_libdirs = args.runtime_libdirs + args.universal_libdirs

    # Figure out what names will be defined by imported Python modules
    args.other_known_names = find_names_in_modules(args.modules)

    if len(args.libraries) == 0:
        msgs.warning_message("No libraries specified", cls="usage")

    # Fetch printer for the requested output language
    if args.output_language == "py":
        printer = printer_python.WrapperPrinter
    elif args.output_language == "json":
        printer = printer_json.WrapperPrinter
    else:
        assert False  # handled by argparse choices

    # Step 1: Parse
    descriptions = core_parser.parse(args.headers, args)

    # Step 2: Process
    processor.process(descriptions, args)

    # Step 3: Print
    printer(args.output, args, descriptions)

    msgs.status_message("Wrapping complete.")

    # Correct what may be a common mistake
    if descriptions.all == []:
        if not args.all_headers:
            msgs.warning_message(
                "There wasn't anything of use in the "
                "specified header file(s). Perhaps you meant to run with "
                "--all-headers to include objects from included sub-headers? ",
                cls="usage",
            )


if __name__ == "__main__":
    main()


================================================
FILE: ctypesgen/ctypedescs.py
================================================
"""
ctypesgen.ctypedescs contains classes to represent a C type. All of these
classes are subclasses of CtypesType.

Unlike in previous versions of ctypesgen, CtypesType and its subclasses are
completely independent of the parser module.

The most important method of CtypesType and its subclasses is the py_string
method. ctype.py_string() returns a string which, when evaluated in the
wrapper at runtime, results in a ctypes type object.

For example, a CtypesType representing an array of four integers could be
created using:

>>> ctype = CtypesArray(CtypesSimple("int", True, 0), 4)

ctype.py_string() would then return "c_int * 4".
"""

__docformat__ = "restructuredtext"

ctypes_type_map = {
    # typename   signed  longs
    ("void", True, 0): "None",
    ("int", True, 0): "c_int",
    ("int", False, 0): "c_uint",
    ("int", True, 1): "c_long",
    ("int", False, 1): "c_ulong",
    ("char", True, 0): "c_char",
    ("char", False, 0): "c_ubyte",
    ("short", True, 0): "c_short",
    ("short", False, 0): "c_ushort",
    ("float", True, 0): "c_float",
    ("double", True, 0): "c_double",
    ("double", True, 1): "c_longdouble",
    ("int8_t", True, 0): "c_int8",
    ("__int8_t", True, 0): "c_int8",
    ("__int8", True, 0): "c_int8",
    ("int16_t", True, 0): "c_int16",
    ("__int16_t", True, 0): "c_int16",
    ("__int16", True, 0): "c_int16",
    ("int32_t", True, 0): "c_int32",
    ("__int32_t", True, 0): "c_int32",
    ("__int32", True, 0): "c_int32",
    ("int64_t", True, 0): "c_int64",
    ("__int64", True, 0): "c_int64",
    ("__int64_t", True, 0): "c_int64",
    ("uint8_t", False, 0): "c_uint8",
    ("__uint8", False, 0): "c_uint8",
    ("__uint8_t", False, 0): "c_uint8",
    ("uint16_t", False, 0): "c_uint16",
    ("__uint16", False, 0): "c_uint16",
    ("__uint16_t", False, 0): "c_uint16",
    ("uint32_t", False, 0): "c_uint32",
    ("__uint32", False, 0): "c_uint32",
    ("__uint32_t", False, 0): "c_uint32",
    ("uint64_t", False, 0): "c_uint64",
    ("__uint64", False, 0): "c_uint64",
    ("__uint64_t", False, 0): "c_uint64",
    ("_Bool", True, 0): "c_bool",
    ("bool", True, 0): "c_bool",
}

ctypes_type_map_python_builtin = {
    ("int", True, -1): "c_short",
    ("int", False, -1): "c_ushort",
    ("int", True, 2): "c_longlong",
    ("int", False, 2): "c_ulonglong",
    ("size_t", True, 0): "c_size_t",
    ("off64_t", True, 0): "c_int64",
    ("wchar_t", True, 0): "c_wchar",
    ("ptrdiff_t", True, 0): "c_ptrdiff_t",  # Requires definition in preamble
    ("ssize_t", True, 0): "c_ptrdiff_t",  # Requires definition in preamble
    ("va_list", True, 0): "c_void_p",
}


# This protocol is used for walking type trees.
class CtypesTypeVisitor(object):
    def visit_struct(self, struct):
        pass

    def visit_enum(self, enum):
        pass

    def visit_typedef(self, name):
        pass

    def visit_error(self, error, cls):
        pass

    def visit_identifier(self, identifier):
        # This one comes from inside ExpressionNodes. There may be
        # ExpressionNode objects in array count expressions.
        pass


def visit_type_and_collect_info(ctype):
    class Visitor(CtypesTypeVisitor):
        def visit_struct(self, struct):
            structs.append(struct)

        def visit_enum(self, enum):
            enums.append(enum)

        def visit_typedef(self, typedef):
            typedefs.append(typedef)

        def visit_error(self, error, cls):
            errors.append((error, cls))

        def visit_identifier(self, identifier):
            identifiers.append(identifier)

    structs = []
    enums = []
    typedefs = []
    errors = []
    identifiers = []
    v = Visitor()
    ctype.visit(v)
    return structs, enums, typedefs, errors, identifiers
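
The visitor protocol above can be illustrated with a self-contained sketch; `Typedef` and `Pointer` here are hypothetical stand-ins for `CtypesTypedef` and `CtypesPointer`, reduced to just their `visit` methods:

```python
class Visitor:
    """Stand-in for CtypesTypeVisitor: one no-op hook per node kind."""
    def visit_typedef(self, name):
        pass

class Typedef:
    def __init__(self, name):
        self.name = name

    def visit(self, visitor):
        # Each node reports itself to the visitor.
        visitor.visit_typedef(self.name)

class Pointer:
    def __init__(self, destination):
        self.destination = destination

    def visit(self, visitor):
        # Composite nodes recurse into their children.
        self.destination.visit(visitor)

def collect_typedefs(ctype):
    """Mirrors visit_type_and_collect_info: collect via a local subclass."""
    names = []

    class Collector(Visitor):
        def visit_typedef(self, name):
            names.append(name)

    ctype.visit(Collector())
    return names

assert collect_typedefs(Pointer(Typedef("FILE"))) == ["FILE"]
```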


# Remove one level of indirection from function pointer; needed for typedefs
# and function parameters.
def remove_function_pointer(t):
    if type(t) == CtypesPointer and type(t.destination) == CtypesFunction:
        return t.destination
    elif type(t) == CtypesPointer:
        t.destination = remove_function_pointer(t.destination)
        return t
    else:
        return t
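
A standalone sketch of the behavior, using hypothetical `Ptr`/`Func` stand-ins for `CtypesPointer`/`CtypesFunction`: a pointer directly to a function loses one level of indirection, while deeper pointer chains keep their outer levels:

```python
# Minimal stand-ins mirroring remove_function_pointer above.
class Ptr:
    def __init__(self, destination):
        self.destination = destination

class Func:
    pass

def remove_function_pointer(t):
    if type(t) == Ptr and type(t.destination) == Func:
        return t.destination            # drop one level: Func* -> Func
    elif type(t) == Ptr:
        t.destination = remove_function_pointer(t.destination)
        return t                        # recurse through deeper pointers
    else:
        return t

f = Func()
assert remove_function_pointer(Ptr(f)) is f          # Func*  -> Func
outer = remove_function_pointer(Ptr(Ptr(f)))         # Func** -> Func*
assert type(outer) == Ptr and outer.destination is f
```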


class CtypesType(object):
    def __init__(self):
        super(CtypesType, self).__init__()
        self.errors = []

    def __repr__(self):
        return '<Ctype (%s) "%s">' % (type(self).__name__, self.py_string())

    def error(self, message, cls=None):
        self.errors.append((message, cls))

    def visit(self, visitor):
        for error, cls in self.errors:
            visitor.visit_error(error, cls)


class CtypesSimple(CtypesType):
    """Represents a builtin type, like "char" or "int"."""

    def __init__(self, name, signed, longs):
        super(CtypesSimple, self).__init__()
        self.name = name
        self.signed = signed
        self.longs = longs

    def py_string(self, ignore_can_be_ctype=None):
        return ctypes_type_map[(self.name, self.signed, self.longs)]


class CtypesSpecial(CtypesType):
    def __init__(self, name):
        super(CtypesSpecial, self).__init__()
        self.name = name

    def py_string(self, ignore_can_be_ctype=None):
        return self.name


class CtypesTypedef(CtypesType):
    """Represents a type defined by a typedef."""

    def __init__(self, name):
        super(CtypesTypedef, self).__init__()
        self.name = name

    def visit(self, visitor):
        if not self.errors:
            visitor.visit_typedef(self.name)
        super(CtypesTypedef, self).visit(visitor)

    def py_string(self, ignore_can_be_ctype=None):
        return self.name


class CtypesBitfield(CtypesType):
    def __init__(self, base, bitfield):
        super(CtypesBitfield, self).__init__()
        self.base = base
        self.bitfield = bitfield

    def visit(self, visitor):
        self.base.visit(visitor)
        super(CtypesBitfield, self).visit(visitor)

    def py_string(self, ignore_can_be_ctype=None):
        return self.base.py_string()


class CtypesPointer(CtypesType):
    def __init__(self, destination, qualifiers):
        super(CtypesPointer, self).__init__()
        self.destination = destination
        self.qualifiers = qualifiers

    def visit(self, visitor):
        if self.destination:
            self.destination.visit(visitor)
        super(CtypesPointer, self).visit(visitor)

    def py_string(self, ignore_can_be_ctype=None):
        return "POINTER(%s)" % self.destination.py_string()


class CtypesArray(CtypesType):
    def __init__(self, base, count):
        super(CtypesArray, self).__init__()
        self.base = base
        self.count = count

    def visit(self, visitor):
        self.base.visit(visitor)
        if self.count:
            self.count.visit(visitor)
        super(CtypesArray, self).visit(visitor)

    def py_string(self, ignore_can_be_ctype=None):
        if self.count is None:
            return "POINTER(%s)" % self.base.py_string()
        if type(self.base) == CtypesArray:
            return "(%s) * int(%s)" % (self.base.py_string(), self.count.py_string(False))
        else:
            return "%s * int(%s)" % (self.base.py_string(), self.count.py_string(False))
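
The strings `py_string` emits rely on ctypes' own array syntax, where multiplying a type by an integer produces an array type; the parentheses added for nested arrays match C's declaration order. A small demonstration with the real `ctypes` module:

```python
from ctypes import c_int, sizeof

# The generated expression "c_int * int(3)" uses ctypes' array syntax:
ArrayType = c_int * 3
arr = ArrayType(1, 2, 3)
assert sizeof(arr) == 3 * sizeof(c_int)
assert list(arr) == [1, 2, 3]

# Nested arrays need the parentheses CtypesArray adds: (c_int * 2) * 3 is
# an array of 3 rows of 2 ints, matching C's "int x[3][2]".
Matrix = (c_int * 2) * 3
m = Matrix()
assert len(m) == 3 and len(m[0]) == 2
```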


class CtypesNoErrorCheck(object):
    def py_string(self, ignore_can_be_ctype=None):
        return "None"

    def __bool__(self):
        return False

    __nonzero__ = __bool__


class CtypesPointerCast(object):
    def __init__(self, target):
        self.target = target

    def py_string(self, ignore_can_be_ctype=None):
        return "lambda v,*a : cast(v, {})".format(self.target.py_string())


class CtypesFunction(CtypesType):
    def __init__(self, restype, parameters, variadic, attrib=dict()):
        super(CtypesFunction, self).__init__()
        self.restype = restype
        self.errcheck = CtypesNoErrorCheck()

        # Don't allow POINTER(None) (c_void_p) as a restype, because ctypes
        # automagically returns it as a plain int, which causes errors.
        # Instead, convert the restype to POINTER(c_ubyte) and attach an
        # errcheck that casts the result back to a c_void_p.
        if (
            type(self.restype) == CtypesPointer
            and type(self.restype.destination) == CtypesSimple
            and self.restype.destination.name == "void"
        ):
            # we will provide a means of converting this to a c_void_p
            self.restype = CtypesPointer(CtypesSpecial("c_ubyte"), ())
            self.errcheck = CtypesPointerCast(CtypesSpecial("c_void_p"))

        # Return "String" instead of "POINTER(c_char)"
        if self.restype.py_string() == "POINTER(c_char)":
            if "const" in self.restype.qualifiers:
                self.restype = CtypesSpecial("c_char_p")
            else:
                self.restype = CtypesSpecial("String")

        self.argtypes = [remove_function_pointer(p) for p in parameters]
        self.variadic = variadic
        self.attrib = attrib

    def visit(self, visitor):
        self.restype.visit(visitor)
        for a in self.argtypes:
            a.visit(visitor)
        super(CtypesFunction, self).visit(visitor)

    def py_string(self, ignore_can_be_ctype=None):
        return "CFUNCTYPE(UNCHECKED(%s), %s)" % (
            self.restype.py_string(),
            ", ".join([a.py_string() for a in self.argtypes]),
        )


last_tagnum = 0


def anonymous_struct_tagnum():
    global last_tagnum
    last_tagnum += 1
    return last_tagnum


def fmt_anonymous_struct_tag(num):
    return "anon_%d" % num


def anonymous_struct_tag():
    return fmt_anonymous_struct_tag(anonymous_struct_tagnum())


class CtypesStruct(CtypesType):
    def __init__(self, tag, attrib, variety, members, src=None):
        super(CtypesStruct, self).__init__()
        self.tag = tag
        self.attrib = attrib
        self.variety = variety  # "struct" or "union"
        self.members = members

        if type(self.tag) == int or not self.tag:
            if type(self.tag) == int:
                self.tag = fmt_anonymous_struct_tag(self.tag)
            else:
                self.tag = anonymous_struct_tag()
            self.anonymous = True
        else:
            self.anonymous = False

        if self.members is None:
            self.opaque = True
        else:
            self.opaque = False

        self.src = src

    def get_required_types(self):
        types = super(CtypesStruct, self).get_required_types()
        types.add((self.variety, self.tag))
        return types

    def visit(self, visitor):
        visitor.visit_struct(self)
        if not self.opaque:
            for name, ctype in self.members:
                ctype.visit(visitor)
        super(CtypesStruct, self).visit(visitor)

    def get_subtypes(self):
        if self.opaque:
            return set()
        else:
            return set([m[1] for m in self.members])

    def py_string(self, ignore_can_be_ctype=None):
        return "%s_%s" % (self.variety, self.tag)


last_tagnum = 0


def anonymous_enum_tag():
    global last_tagnum
    last_tagnum += 1
    return "anon_%d" % last_tagnum


class CtypesEnum(CtypesType):
    def __init__(self, tag, enumerators, src=None):
        super(CtypesEnum, self).__init__()
        self.tag = tag
        self.enumerators = enumerators

        if not self.tag:
            self.tag = anonymous_enum_tag()
            self.anonymous = True
        else:
            self.anonymous = False

        if self.enumerators is None:
            self.opaque = True
        else:
            self.opaque = False

        self.src = src

    def visit(self, visitor):
        visitor.visit_enum(self)
        super(CtypesEnum, self).visit(visitor)

    def py_string(self, ignore_can_be_ctype=None):
        return "enum_%s" % self.tag


================================================
FILE: ctypesgen/descriptions.py
================================================
"""
ctypesgen.descriptions contains classes to represent a description of a
struct, union, enum, function, constant, variable, or macro. All the
description classes are subclassed from an abstract base class, Description.
The descriptions module also contains a class, DescriptionCollection, to hold
lists of Description objects.
"""


class DescriptionCollection(object):
    """Represents a collection of Descriptions."""

    def __init__(
        self, constants, typedefs, structs, enums, functions, variables, macros, all, output_order
    ):
        self.constants = constants
        self.typedefs = typedefs
        self.structs = structs
        self.enums = enums
        self.functions = functions
        self.variables = variables
        self.macros = macros
        self.all = all
        self.output_order = output_order


class Description(object):
    """Represents a constant, typedef, struct, function, variable, enum,
    or macro description. Description is an abstract base class."""

    def __init__(self, src=None):
        super(Description, self).__init__()
        self.src = src  # A tuple of (filename, lineno)

        # Whether this object will be included in the output file. Values are
        # "yes", "never", and "if_needed".
        self.include_rule = "yes"

        # A word about requirements, and dependents:
        # If X requires Y, Y is in X.requirements.
        # If X is in Y.requirements, then Y is in X.dependents.
        self.requirements = set()
        self.dependents = set()

        # If the processor module finds a fatal error that prevents a
        # description from being output, it appends a string describing
        # the problem to 'errors'. If it finds a nonfatal error, it appends a
        # string to 'warnings'. If the description would have been output,
        # the errors and warnings are printed.

        # If there is anything in 'errors' after processing is complete, the
        # description is not output.

        self.errors = []
        self.warnings = []

    def add_requirements(self, reqs):
        self.requirements = self.requirements.union(reqs)
        for req in reqs:
            req.dependents.add(self)

    def error(self, msg, cls=None):
        self.errors.append((msg, cls))

    def warning(self, msg, cls=None):
        self.warnings.append((msg, cls))

    def __repr__(self):
        return "<Description: %s>" % self.casual_name()

    def casual_name(self):
        """Return a name to show the user."""

    def py_name(self):
        """Return the name associated with this description in Python code."""

    def c_name(self):
        """Return the name associated with this description in C code."""


class ConstantDescription(Description):
    """Simple class to contain information about a constant."""

    def __init__(self, name, value, src=None):
        super(ConstantDescription, self).__init__(src)
        # Name of constant, a string
        self.name = name
        # Value of constant, as an ExpressionNode object
        self.value = value

    def casual_name(self):
        return 'Constant "%s"' % self.name

    def py_name(self):
        return self.name

    def c_name(self):
        return self.name


class TypedefDescription(Description):
    """Simple container class for a type definition."""

    def __init__(self, name, ctype, src=None):
        super(TypedefDescription, self).__init__(src)
        self.name = name  # Name, a string
        self.ctype = ctype  # The base type as a ctypedescs.CtypesType object

    def casual_name(self):
        return 'Typedef "%s"' % self.name

    def py_name(self):
        return self.name

    def c_name(self):
        return self.name


class StructDescription(Description):
    """Simple container class for a structure or union definition."""

    def __init__(self, tag, attrib, variety, members, opaque, ctype, src=None):
        super(StructDescription, self).__init__(src)
        # The name of the structure minus the "struct" or "union"
        self.tag = tag
        self.attrib = attrib
        # A string "struct" or "union"
        self.variety = variety
        # A list of pairs of (name,ctype)
        self.members = members
        # True if struct body was not specified in header file
        self.opaque = opaque
        # The original CtypesStruct that created the struct
        self.ctype = ctype

    def casual_name(self):
        return '%s "%s"' % (self.variety.capitalize(), self.tag)

    def py_name(self):
        return "%s_%s" % (self.variety, self.tag)

    def c_name(self):
        return "%s %s" % (self.variety, self.tag)


class EnumDescription(Description):
    """Simple container class for an enum definition."""

    def __init__(self, tag, members, ctype, src=None):
        super(EnumDescription, self).__init__(src)
        # The name of the enum, minus the "enum"
        self.tag = tag
        # A list of (name,value) pairs where value is a number
        self.members = members
        # The original CtypesEnum that created the enum
        self.ctype = ctype

    def casual_name(self):
        return 'Enum "%s"' % self.tag

    def py_name(self):
        return "enum_%s" % self.tag

    def c_name(self):
        return "enum %s" % self.tag


class FunctionDescription(Description):
    """Simple container class for a C function."""

    def __init__(self, name, restype, argtypes, errcheck, variadic, attrib, src):
        super(FunctionDescription, self).__init__(src)
        # Name, a string
        self.name = name
        # Name according to C - stored in case description is renamed
        self.cname = name
        # A ctype representing return type
        self.restype = restype
        # A list of ctypes representing the argument types
        self.argtypes = argtypes
        # An optional error checker/caster
        self.errcheck = errcheck
        # Does this function accept a variable number of arguments?
        self.variadic = variadic
        # The set of attributes applied to the function (e.g. stdcall)
        self.attrib = attrib

    def casual_name(self):
        return 'Function "%s"' % self.name

    def py_name(self):
        return self.name

    def c_name(self):
        return self.cname


class VariableDescription(Description):
    """Simple container class for a C variable declaration."""

    def __init__(self, name, ctype, src=None):
        super(VariableDescription, self).__init__(src)
        # Name, a string
        self.name = name
        # Name according to C - stored in case description is renamed
        self.cname = name
        # The type of the variable
        self.ctype = ctype

    def casual_name(self):
        return 'Variable "%s"' % self.name

    def py_name(self):
        return self.name

    def c_name(self):
        return self.cname


class MacroDescription(Description):
    """Simple container class for a C macro."""

    def __init__(self, name, params, expr, src=None):
        super(MacroDescription, self).__init__(src)
        self.name = name
        self.params = params
        self.expr = expr  # ExpressionNode for the macro's body

    def casual_name(self):
        return 'Macro "%s"' % self.name

    def py_name(self):
        return self.name

    def c_name(self):
        return self.name


class UndefDescription(Description):
    """Simple container class for a preprocessor #undef directive."""

    def __init__(self, macro, src=None):
        super(UndefDescription, self).__init__(src)
        self.include_rule = "if_needed"

        self.macro = macro

    def casual_name(self):
        return 'Undef "%s"' % self.macro.name

    def py_name(self):
        return "#undef:%s" % self.macro.name

    def c_name(self):
        return "#undef %s" % self.macro.name


================================================
FILE: ctypesgen/expressions.py
================================================
"""
The expressions module contains classes to represent an expression. The main
class is ExpressionNode. ExpressionNode's most useful method is py_string(),
which returns a Python string representing that expression.
"""

import warnings
import keyword

from ctypesgen.ctypedescs import (
    CtypesPointer,
    CtypesSimple,
    CtypesStruct,
    CtypesType,
)

# Right now, the objects in this module are all oriented toward evaluation.
# However, they don't have to be, since ctypes objects are mutable. For example,
# shouldn't it be possible to translate the macro:
#
#   #define INCREMENT(x) ++x
#
# into Python? The resulting code should be:
#
#   def INCREMENT(x):
#       x.value+=1
#       return x.value
#
# On the other hand, this would be a challenge to write.


class EvaluationContext(object):
    """Interface for evaluating expression nodes."""

    def evaluate_identifier(self, name):
        warnings.warn('Attempt to evaluate identifier "%s" failed' % name)
        return 0

    def evaluate_sizeof(self, object):
        warnings.warn('Attempt to evaluate sizeof object "%s" failed' % str(object))
        return 0

    def evaluate_parameter(self, name):
        warnings.warn('Attempt to evaluate parameter "%s" failed' % name)
        return 0


class ExpressionNode(object):
    def __init__(self):
        self.errors = []

    def error(self, message, cls=None):
        self.errors.append((message, cls))

    def __repr__(self):
        try:
            string = repr(self.py_string(True))
        except ValueError:
            string = "<error in expression node>"
        return "<%s: %s>" % (type(self).__name__, string)

    def visit(self, visitor):
        for error, cls in self.errors:
            visitor.visit_error(error, cls)


class ConstantExpressionNode(ExpressionNode):
    def __init__(self, value, is_literal=False):
        ExpressionNode.__init__(self)
        self.value = value
        self.is_literal = is_literal

    def evaluate(self, context):
        return self.value

    def py_string(self, can_be_ctype):
        if self.is_literal:
            return self.value
        if self.value == float("inf"):
            return "float('inf')"
        elif self.value == float("-inf"):
            return "float('-inf')"
        return repr(self.value)
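
The `float("inf")` special case in `py_string` exists because `repr()` of an IEEE infinity is the bare token `inf`, which is not a valid Python expression; a quick check:

```python
# repr() of an IEEE infinity is the bare token "inf" -- not valid Python:
assert repr(float("inf")) == "inf"
try:
    eval(repr(float("inf")))          # NameError: name 'inf' is not defined
    raise AssertionError("expected NameError")
except NameError:
    pass

# The form emitted by py_string round-trips correctly:
assert eval("float('inf')") == float("inf")
assert eval("float('-inf')") == float("-inf")
```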


class IdentifierExpressionNode(ExpressionNode):
    def __init__(self, name):
        ExpressionNode.__init__(self)
        self.name = name

    def evaluate(self, context):
        return context.evaluate_identifier(self.name)

    def visit(self, visitor):
        visitor.visit_identifier(self.name)
        ExpressionNode.visit(self, visitor)

    def py_string(self, can_be_ctype):
        # Errors will be thrown in generated code if identifier evaluates
        # to a ctypes object, and can_be_ctype is False.
        return self.name


class ParameterExpressionNode(ExpressionNode):
    def __init__(self, name):
        ExpressionNode.__init__(self)
        self.name = name

    def evaluate(self, context):
        return context.evaluate_parameter(self.name)

    def visit(self, visitor):
        ExpressionNode.visit(self, visitor)

    def py_string(self, can_be_ctype):
        # Errors will be thrown in generated code if parameter is
        # a ctypes object, and can_be_ctype is False.
        return self.name


class UnaryExpressionNode(ExpressionNode):
    def __init__(self, name, op, format, child_can_be_ctype, child):
        ExpressionNode.__init__(self)
        self.name = name
        self.op = op
        self.format = format
        self.child_can_be_ctype = child_can_be_ctype
        self.child = child

    def visit(self, visitor):
        self.child.visit(visitor)
        ExpressionNode.visit(self, visitor)

    def evaluate(self, context):
        if self.op:
            return self.op(self.child.evaluate(context))
        else:
            raise ValueError('The C operator "%s" can\'t be evaluated right now' % self.name)

    def py_string(self, can_be_ctype):
        return self.format % self.child.py_string(self.child_can_be_ctype and can_be_ctype)


class SizeOfExpressionNode(ExpressionNode):
    def __init__(self, child):
        ExpressionNode.__init__(self)
        self.child = child

    def visit(self, visitor):
        self.child.visit(visitor)
        ExpressionNode.visit(self, visitor)

    def evaluate(self, context):
        if isinstance(self.child, CtypesType):
            return context.evaluate_sizeof(self.child)
        else:
            return context.evaluate_sizeof_object(self.child)

    def py_string(self, can_be_ctype):
        if isinstance(self.child, CtypesType):
            return "sizeof(%s)" % self.child.py_string()
        else:
            return "sizeof(%s)" % self.child.py_string(True)


class BinaryExpressionNode(ExpressionNode):
    def __init__(self, name, op, format, can_be_ctype, left, right):
        ExpressionNode.__init__(self)
        self.name = name
        self.op = op
        self.format = format
        self.can_be_ctype = can_be_ctype
        self.left = left
        self.right = right

    def visit(self, visitor):
        self.left.visit(visitor)
        self.right.visit(visitor)
        ExpressionNode.visit(self, visitor)

    def evaluate(self, context):
        if self.op:
            return self.op(self.left.evaluate(context), self.right.evaluate(context))
        else:
            raise ValueError('The C operator "%s" can\'t be evaluated right now' % self.name)

    def py_string(self, can_be_ctype):
        return self.format % (
            self.left.py_string(self.can_be_ctype[0] and can_be_ctype),
            self.right.py_string(self.can_be_ctype[1] and can_be_ctype),
        )


class ConditionalExpressionNode(ExpressionNode):
    def __init__(self, cond, yes, no):
        ExpressionNode.__init__(self)
        self.cond = cond
        self.yes = yes
        self.no = no

    def visit(self, visitor):
        self.cond.visit(visitor)
        self.yes.visit(visitor)
        self.no.visit(visitor)
        ExpressionNode.visit(self, visitor)

    def evaluate(self, context):
        if self.cond.evaluate(context):
            return self.yes.evaluate(context)
        else:
            return self.no.evaluate(context)

    def py_string(self, can_be_ctype):
        return "%s and %s or %s" % (
            self.cond.py_string(True),
            self.yes.py_string(can_be_ctype),
            self.no.py_string(can_be_ctype),
        )
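
Note that `py_string` emits the pre-2.5 `cond and yes or no` idiom rather than a conditional expression; this picks the wrong branch whenever the `yes` value is falsy, which may be a latent issue for generated expressions that evaluate to 0:

```python
# The "cond and yes or no" idiom predates Python's conditional expression:
def old_style(cond, yes, no):
    return cond and yes or no

def ternary(cond, yes, no):
    return yes if cond else no

assert old_style(True, 1, 2) == 1     # agrees with the ternary here...
assert ternary(True, 0, 2) == 0
assert old_style(True, 0, 2) == 2     # ...but not when the "yes" value is 0
```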


class AttributeExpressionNode(ExpressionNode):
    def __init__(self, op, format, base, attribute):
        ExpressionNode.__init__(self)
        self.op = op
        self.format = format
        self.base = base
        self.attribute = attribute

        # Attribute access will raise parse errors if you don't do this.
        # Fortunately, the processor module does the same thing to
        # the struct member name.
        if self.attribute in keyword.kwlist:
            self.attribute = "_" + self.attribute

    def visit(self, visitor):
        self.base.visit(visitor)
        ExpressionNode.visit(self, visitor)

    def evaluate(self, context):
        return self.op(self.base.evaluate(context), self.attribute)

    def py_string(self, can_be_ctype):
        if can_be_ctype:
            return self.format % (self.base.py_string(can_be_ctype), self.attribute)
        else:
            return "(%s.value)" % (
                self.format % (self.base.py_string(can_be_ctype), self.attribute)
            )


class CallExpressionNode(ExpressionNode):
    def __init__(self, function, arguments):
        ExpressionNode.__init__(self)
        self.function = function
        self.arguments = arguments

    def visit(self, visitor):
        self.function.visit(visitor)
        for arg in self.arguments:
            arg.visit(visitor)
        ExpressionNode.visit(self, visitor)

    def evaluate(self, context):
        arguments = [arg.evaluate(context) for arg in self.arguments]
        return self.function.evaluate(context)(*arguments)

    def py_string(self, can_be_ctype):
        function = self.function.py_string(can_be_ctype)
        arguments = [x.py_string(can_be_ctype) for x in self.arguments]
        return "(%s (%s))" % (function, ", ".join(arguments))


class TypeCastExpressionNode(ExpressionNode):
    """
    Type cast expressions as handled by ctypesgen.  There is a strong
    possibility that this does not support all types of casts.
    """

    def __init__(self, base, ctype):
        ExpressionNode.__init__(self)
        self.base = base
        self.ctype = ctype

    def visit(self, visitor):
        self.base.visit(visitor)
        self.ctype.visit(visitor)
        ExpressionNode.visit(self, visitor)

    def evaluate(self, context):
        return self.base.evaluate(context)

    def py_string(self, can_be_ctype):
        if isinstance(self.ctype, CtypesPointer):
            return "cast({}, {})".format(self.base.py_string(True), self.ctype.py_string())
        elif isinstance(self.ctype, CtypesStruct):
            raise TypeError(
                "conversion to non-scalar type ({}) requested from {}".format(
                    self.ctype, self.base.py_string(False)
                )
            )
        else:
            # In reality, this conversion should only really work if the types
            # are scalar types.  We won't work really hard to test if the types
            # are  indeed scalar.
            # To be backwards compatible, we always return literals for builtin types.
            # We use a function to convert to integer for c_char types since
            # c_char can take integer or byte types, but the others can *only*
            # take non-char arguments.
            # ord_if_char must be provided by preambles
            if isinstance(self.ctype, CtypesSimple) and (
                self.ctype.name,
                self.ctype.signed,
            ) == (
                "char",
                True,
            ):
                ord_if_char = ""
            elif isinstance(self.ctype, CtypesSimple) and self.ctype.name == "void":
                # This is a very simple type cast:  cast everything to (void)
                # At least one macro from mingw does this
                return "None"
            else:
                ord_if_char = "ord_if_char"

            return "({to} ({ord_if_char}({frm}))).value".format(
                to=self.ctype.py_string(),
                ord_if_char=ord_if_char,
                frm=self.base.py_string(False),
            )


class UnsupportedExpressionNode(ExpressionNode):
    def __init__(self, message):
        ExpressionNode.__init__(self)
        self.message = message
        self.error(message, "unsupported-type")

    def evaluate(self, context):
        raise ValueError("Tried to evaluate an unsupported expression node: %s" % self.message)

    def __repr__(self):
        return "<UnsupportedExpressionNode>"

    def py_string(self, can_be_ctype):
        raise ValueError("Called py_string() on an unsupported expression node: %s" % self.message)


================================================
FILE: ctypesgen/libraryloader.py
================================================
"""
Load libraries - appropriately for all our supported platforms
"""
# ----------------------------------------------------------------------------
# Copyright (c) 2008 David James
# Copyright (c) 2006-2008 Alex Holkner
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
#
#  * Redistributions of source code must retain the above copyright
#    notice, this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright
#    notice, this list of conditions and the following disclaimer in
#    the documentation and/or other materials provided with the
#    distribution.
#  * Neither the name of pyglet nor the names of its
#    contributors may be used to endorse or promote products
#    derived from this software without specific prior written
#    permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
# FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
# COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
# ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
# ----------------------------------------------------------------------------

import ctypes
import ctypes.util
import glob
import os.path
import platform
import re
import sys


def _environ_path(name):
    """Split an environment variable into a path-like list of elements"""
    if name in os.environ:
        return os.environ[name].split(":")
    return []


class LibraryLoader:
    """
    A base class for loading libraries.
    Subclasses load libraries for specific platforms.
    """

    # library names formatted specifically for platforms
    name_formats = ["%s"]

    class Lookup:
        """Looking up calling conventions for a platform"""

        mode = ctypes.DEFAULT_MODE

        def __init__(self, path):
            super(LibraryLoader.Lookup, self).__init__()
            self.access = dict(cdecl=ctypes.CDLL(path, self.mode))

        def get(self, name, calling_convention="cdecl"):
            """Return the given name according to the selected calling convention"""
            if calling_convention not in self.access:
                raise LookupError(
                    "Unknown calling convention '{}' for function '{}'".format(
                        calling_convention, name
                    )
                )
            return getattr(self.access[calling_convention], name)

        def has(self, name, calling_convention="cdecl"):
            """Return True if this given calling convention finds the given 'name'"""
            if calling_convention not in self.access:
                return False
            return hasattr(self.access[calling_convention], name)

        def __getattr__(self, name):
            return getattr(self.access["cdecl"], name)

    def __init__(self):
        self.other_dirs = []

    def __call__(self, libname):
        """Given the name of a library, load it."""
        paths = self.getpaths(libname)

        for path in paths:
            # noinspection PyBroadException
            try:
                return self.Lookup(path)
            except Exception:  # pylint: disable=broad-except
                pass

        raise ImportError("Could not load %s." % libname)

    def getpaths(self, libname):
        """Return a list of paths where the library might be found."""
        if os.path.isabs(libname):
            yield libname
        else:
            # search through a prioritized series of locations for the library

            # we first search any specific directories identified by user
            for dir_i in self.other_dirs:
                for fmt in self.name_formats:
                    # dir_i should be absolute already
                    yield os.path.join(dir_i, fmt % libname)

            # check if this code is even stored in a physical file
            try:
                this_file = __file__
            except NameError:
                this_file = None

            # then we search the directory where the generated python interface is stored
            if this_file is not None:
                for fmt in self.name_formats:
                    yield os.path.abspath(os.path.join(os.path.dirname(this_file), fmt % libname))

            # now, use the ctypes tools to try to find the library
            for fmt in self.name_formats:
                path = ctypes.util.find_library(fmt % libname)
                if path:
                    yield path

            # then we search all paths identified as platform-specific lib paths
            for path in self.getplatformpaths(libname):
                yield path

            # Finally, we'll try the users current working directory
            for fmt in self.name_formats:
                yield os.path.abspath(os.path.join(os.path.curdir, fmt % libname))

    def getplatformpaths(self, _libname):  # pylint: disable=no-self-use
        """Return all the library paths available in this platform"""
        return []


# Darwin (Mac OS X)


class DarwinLibraryLoader(LibraryLoader):
    """Library loader for MacOS"""

    name_formats = [
        "lib%s.dylib",
        "lib%s.so",
        "lib%s.bundle",
        "%s.dylib",
        "%s.so",
        "%s.bundle",
        "%s",
    ]

    class Lookup(LibraryLoader.Lookup):
        """
        Looking up library files for this platform (Darwin aka MacOS)
        """

        # Darwin requires dlopen to be called with mode RTLD_GLOBAL instead
        # of the default RTLD_LOCAL.  Without this, you end up with
        # libraries not being loadable, resulting in "Symbol not found"
        # errors
        mode = ctypes.RTLD_GLOBAL

    def getplatformpaths(self, libname):
        if os.path.sep in libname:
            names = [libname]
        else:
            names = [fmt % libname for fmt in self.name_formats]

        for directory in self.getdirs(libname):
            for name in names:
                yield os.path.join(directory, name)

    @staticmethod
    def getdirs(libname):
        """Implements the dylib search as specified in Apple documentation:

        http://developer.apple.com/documentation/DeveloperTools/Conceptual/
            DynamicLibraries/Articles/DynamicLibraryUsageGuidelines.html

        Before commencing the standard search, the method first checks
        the bundle's ``Frameworks`` directory if the application is running
        within a bundle (OS X .app).
        """

        dyld_fallback_library_path = _environ_path("DYLD_FALLBACK_LIBRARY_PATH")
        if not dyld_fallback_library_path:
            dyld_fallback_library_path = [
                os.path.expanduser("~/lib"),
                "/usr/local/lib",
                "/usr/lib",
            ]

        dirs = []

        if "/" in libname:
            dirs.extend(_environ_path("DYLD_LIBRARY_PATH"))
        else:
            dirs.extend(_environ_path("LD_LIBRARY_PATH"))
            dirs.extend(_environ_path("DYLD_LIBRARY_PATH"))
            dirs.extend(_environ_path("LD_RUN_PATH"))

        if hasattr(sys, "frozen") and getattr(sys, "frozen") == "macosx_app":
            dirs.append(os.path.join(os.environ["RESOURCEPATH"], "..", "Frameworks"))

        dirs.extend(dyld_fallback_library_path)

        return dirs


# Posix


class PosixLibraryLoader(LibraryLoader):
    """Library loader for POSIX-like systems (including Linux)"""

    _ld_so_cache = None

    _include = re.compile(r"^\s*include\s+(?P<pattern>.*)")

    name_formats = ["lib%s.so", "%s.so", "%s"]

    class _Directories(dict):
        """Deal with directories"""

        def __init__(self):
            dict.__init__(self)
            self.order = 0

        def add(self, directory):
            """Add a directory to our current set of directories"""
            if len(directory) > 1:
                directory = directory.rstrip(os.path.sep)
            # only add the directory if it exists; only advance the order
            # counter when the directory was not already in the set
            if not os.path.exists(directory):
                return
            order = self.setdefault(directory, self.order)
            if order == self.order:
                self.order += 1

        def extend(self, directories):
            """Add a list of directories to our set"""
            for a_dir in directories:
                self.add(a_dir)

        def ordered(self):
            """Sort the list of directories"""
            return (i[0] for i in sorted(self.items(), key=lambda d: d[1]))

    def _get_ld_so_conf_dirs(self, conf, dirs):
        """
        Recursive function to help parse all ld.so.conf files, including proper
        handling of the `include` directive.
        """

        try:
            with open(conf) as fileobj:
                for dirname in fileobj:
                    dirname = dirname.strip()
                    if not dirname:
                        continue

                    match = self._include.match(dirname)
                    if not match:
                        dirs.add(dirname)
                    else:
                        for dir2 in glob.glob(match.group("pattern")):
                            self._get_ld_so_conf_dirs(dir2, dirs)
        except IOError:
            pass

    def _create_ld_so_cache(self):
        # Recreate search path followed by ld.so.  This is going to be
        # slow to build, and incorrect (ld.so uses ld.so.cache, which may
        # not be up-to-date).  Used only as fallback for distros without
        # /sbin/ldconfig.
        #
        # We assume the DT_RPATH and DT_RUNPATH binary sections are omitted.

        directories = self._Directories()
        for name in (
            "LD_LIBRARY_PATH",
            "SHLIB_PATH",  # HP-UX
            "LIBPATH",  # OS/2, AIX
            "LIBRARY_PATH",  # BE/OS
        ):
            if name in os.environ:
                directories.extend(os.environ[name].split(os.pathsep))

        self._get_ld_so_conf_dirs("/etc/ld.so.conf", directories)

        bitage = platform.architecture()[0]

        unix_lib_dirs_list = []
        if bitage.startswith("64"):
            # prefer 64 bit if that is our arch
            unix_lib_dirs_list += ["/lib64", "/usr/lib64"]

        # must include standard libs, since those paths are also used by 64 bit
        # installs
        unix_lib_dirs_list += ["/lib", "/usr/lib"]
        if sys.platform.startswith("linux"):
            # Try and support multiarch work in Ubuntu
            # https://wiki.ubuntu.com/MultiarchSpec
            if bitage.startswith("32"):
                # Assume Intel/AMD x86 compat
                unix_lib_dirs_list += ["/lib/i386-linux-gnu", "/usr/lib/i386-linux-gnu"]
            elif bitage.startswith("64"):
                # Assume Intel/AMD x86 compatible
                unix_lib_dirs_list += [
                    "/lib/x86_64-linux-gnu",
                    "/usr/lib/x86_64-linux-gnu",
                ]
            else:
                # guess...
                unix_lib_dirs_list += glob.glob("/lib/*linux-gnu")
        directories.extend(unix_lib_dirs_list)

        cache = {}
        lib_re = re.compile(r"lib(.*)\.s[ol]")
        # ext_re = re.compile(r"\.s[ol]$")
        for our_dir in directories.ordered():
            try:
                for path in glob.glob("%s/*.s[ol]*" % our_dir):
                    file = os.path.basename(path)

                    # Index by filename
                    cache_i = cache.setdefault(file, set())
                    cache_i.add(path)

                    # Index by library name
                    match = lib_re.match(file)
                    if match:
                        library = match.group(1)
                        cache_i = cache.setdefault(library, set())
                        cache_i.add(path)
            except OSError:
                pass

        self._ld_so_cache = cache

    def getplatformpaths(self, libname):
        if self._ld_so_cache is None:
            self._create_ld_so_cache()

        result = self._ld_so_cache.get(libname, set())
        for i in result:
            # we iterate through all found paths for library, since we may have
            # actually found multiple architectures or other library types that
            # may not load
            yield i


# Windows


class WindowsLibraryLoader(LibraryLoader):
    """Library loader for Microsoft Windows"""

    name_formats = ["%s.dll", "lib%s.dll", "%slib.dll", "%s"]

    class Lookup(LibraryLoader.Lookup):
        """Lookup class for Windows libraries..."""

        def __init__(self, path):
            super(WindowsLibraryLoader.Lookup, self).__init__(path)
            self.access["stdcall"] = ctypes.windll.LoadLibrary(path)


# Platform switching

# If your value of sys.platform does not appear in this dict, please contact
# the Ctypesgen maintainers.

loaderclass = {
    "darwin": DarwinLibraryLoader,
    "cygwin": WindowsLibraryLoader,
    "win32": WindowsLibraryLoader,
    "msys": WindowsLibraryLoader,
}

load_library = loaderclass.get(sys.platform, PosixLibraryLoader)()


def add_library_search_dirs(other_dirs):
    """
    Add directories to the library search path.
    Relative paths are converted to absolute paths with respect to the current
    working directory.
    """
    for path in other_dirs:
        if not os.path.isabs(path):
            path = os.path.abspath(path)
        load_library.other_dirs.append(path)


del loaderclass
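
The search logic above boils down to combining each search directory with each entry in `name_formats`, in priority order, and yielding absolute names as-is. A minimal self-contained sketch of that idea (the `candidate_paths` helper and the directory list are hypothetical illustrations, not part of ctypesgen):

```python
import os

# Mirrors PosixLibraryLoader.name_formats; the real loaders have
# platform-specific lists (e.g. dylib/bundle variants on Darwin).
name_formats = ["lib%s.so", "%s.so", "%s"]


def candidate_paths(libname, search_dirs):
    """Yield possible library paths, highest priority first."""
    if os.path.isabs(libname):
        # an absolute name is tried verbatim, like LibraryLoader.getpaths
        yield libname
        return
    for directory in search_dirs:
        for fmt in name_formats:
            yield os.path.join(directory, fmt % libname)


paths = list(candidate_paths("m", ["/opt/lib", "/usr/lib"]))
print(paths[0])    # on POSIX: /opt/lib/libm.so
print(len(paths))  # 2 directories x 3 formats = 6
```

The real `__call__` then simply tries `Lookup(path)` on each candidate and keeps the first one that loads.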


================================================
FILE: ctypesgen/messages.py
================================================
"""
ctypesgen.messages contains functions to display status, error, or warning
messages to the user. Warning and error messages are also associated
with a "message class", which is a string, which currently has no effect.

Error classes are:
'usage' - there was something funny about the command-line parameters
'cparser' - there was a syntax error in the header file
'missing-library' - a library could not be loaded
'macro' - a macro could not be translated to Python
'unsupported-type' - there was a type in the header that ctypes cannot use, like
    "long double".
'other' - catchall.

Warning classes are:
'usage' - there was something funny about the command-line parameters
'rename' - a description has been renamed to avoid a name conflict
'other' - catchall.
"""

import logging

__all__ = ["error_message", "warning_message", "status_message"]

log = logging.getLogger("ctypesgen")
ch = logging.StreamHandler()  # use stdio
logging_fmt_str = "%(levelname)s: %(message)s"
formatter = logging.Formatter(logging_fmt_str)
ch.setFormatter(formatter)
log.addHandler(ch)
log.setLevel(logging.INFO)  # default level that ctypesgen was using with original version


def error_message(msg, cls=None):
    log.error("%s", msg)


def warning_message(msg, cls=None):
    log.warning("%s", msg)


def status_message(msg):
    log.info("Status: %s", msg)


================================================
FILE: ctypesgen/options.py
================================================
"""
All of the components of ctypesgen require an argument called "options".
In command-line usage, this would be an argparse.Namespace object. However,
if ctypesgen is used as a standard Python module, constructing this object
would be a pain. So this module exists to provide a "default" options object
for convenience.
"""

import argparse
import copy

default_values = {
    "other_headers": [],
    "modules": [],
    "include_search_paths": [],
    "compile_libdirs": [],
    "runtime_libdirs": [],
    "cpp": "gcc -E",
    "allow_gnu_c": False,
    "cpp_defines": [],
    "cpp_undefines": [],
    "save_preprocessed_headers": None,
    "all_headers": False,
    "builtin_symbols": False,
    "include_symbols": [],
    "exclude_symbols": [],
    "show_all_errors": False,
    "show_long_errors": False,
    "show_macro_warnings": True,
    "header_template": None,
    "inserted_files": [],
    "other_known_names": [],
    "include_macros": True,
    "include_undefs": True,
    "libraries": [],
    "strip_build_path": None,
    "output_language": "py",
    "no_stddef_types": False,
    "no_gnu_types": False,
    "no_python_types": False,
    "debug_level": 0,
    "strip_prefixes": [],
    "embed_preamble": True,
    "no_load_library": False,
}


def get_default_options():
    return argparse.Namespace(**copy.deepcopy(default_values))
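
Because the defaults are deep-copied, every `Namespace` gets its own mutable list values. A sketch of why that matters, using a trimmed-down defaults dict (the keys shown are a small subset of the real `default_values`):

```python
import argparse
import copy

# Hypothetical subset of ctypesgen's default_values.
default_values = {"libraries": [], "cpp": "gcc -E", "all_headers": False}


def get_default_options():
    # deepcopy so callers can mutate list values (e.g. append libraries)
    # without corrupting the shared defaults dict
    return argparse.Namespace(**copy.deepcopy(default_values))


opts_a = get_default_options()
opts_b = get_default_options()
opts_a.libraries.append("m")

print(opts_a.libraries)  # ['m']
print(opts_b.libraries)  # [] - unaffected, thanks to deepcopy
```

With a shallow copy, `opts_a` and `opts_b` would share the same `libraries` list and the append would leak into every subsequent options object.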


================================================
FILE: ctypesgen/parser/.gitignore
================================================
new_parsetab.py
parser.out


================================================
FILE: ctypesgen/parser/__init__.py
================================================
"""
This package parses C header files and generates lists of functions, typedefs,
variables, structs, unions, enums, macros, and constants. This package knows
nothing about the libraries themselves.

The public interface for this package is the function "parse". Use as follows:
>>> descriptions = parse(["inputfile1.h","inputfile2.h"], options)
where "options" is an argparse.Namespace object.

parse() returns a DescriptionCollection object. See ctypesgen.descriptions
for more information.

"""

from .datacollectingparser import DataCollectingParser


def parse(headers, options):
    parser = DataCollectingParser(headers, options)
    parser.parse()
    return parser.data()


__all__ = ["parse"]


================================================
FILE: ctypesgen/parser/cdeclarations.py
================================================
"""
This file contains classes that represent C declarations. cparser produces
declarations in this format, and ctypesparser reformats them into a format that
is not C-specific. The other modules don't need to touch these.
"""

__docformat__ = "restructuredtext"

# --------------------------------------------------------------------------
# C Object Model
# --------------------------------------------------------------------------


class Declaration(object):
    def __init__(self):
        self.declarator = None
        self.type = Type()
        self.storage = None
        self.attrib = Attrib()

    def __repr__(self):
        d = {"declarator": self.declarator, "type": self.type}
        if self.storage:
            d["storage"] = self.storage
        li = ["%s=%r" % (k, v) for k, v in d.items()]
        return "Declaration(%s)" % ", ".join(li)


class Declarator(object):
    pointer = None

    def __init__(self):
        self.identifier = None
        self.initializer = None
        self.array = None
        self.parameters = None
        self.bitfield = None
        self.attrib = Attrib()

    def __repr__(self):
        s = self.identifier or ""
        if self.bitfield:
            s += f":{self.bitfield.value}"
        if self.array:
            s += repr(self.array)
        if self.initializer:
            s += " = %r" % self.initializer
        if self.parameters is not None:
            s += "(" + ", ".join([repr(p) for p in self.parameters]) + ")"
        return s


class Pointer(Declarator):
    pointer = None

    def __init__(self):
        super(Pointer, self).__init__()
        self.qualifiers = []

    def __repr__(self):
        q = ""
        if self.qualifiers:
            q = "<%s>" % " ".join(self.qualifiers)
        return "POINTER%s(%r)" % (q, self.pointer) + super(Pointer, self).__repr__()


class Array(object):
    def __init__(self):
        self.size = None
        self.array = None

    def __repr__(self):
        if self.size:
            a = "[%r]" % self.size
        else:
            a = "[]"
        if self.array:
            return repr(self.array) + a
        else:
            return a


class Parameter(object):
    def __init__(self):
        self.type = Type()
        self.storage = None
        self.declarator = None
        self.attrib = Attrib()

    def __repr__(self):
        d = {"type": self.type}
        if self.declarator:
            d["declarator"] = self.declarator
        if self.storage:
            d["storage"] = self.storage
        li = ["%s=%r" % (k, v) for k, v in d.items()]
        return "Parameter(%s)" % ", ".join(li)


class Type(object):
    def __init__(self):
        self.qualifiers = []
        self.specifiers = []

    def __repr__(self):
        return " ".join(self.qualifiers + [str(s) for s in self.specifiers])


# These are used only internally.


class StorageClassSpecifier(str):
    def __repr__(self):
        return "StorageClassSpecifier({})".format(str(self))


class TypeSpecifier(str):
    def __repr__(self):
        return "TypeSpecifier({})".format(str(self))


class StructTypeSpecifier(object):
    def __init__(self, is_union, attrib, tag, declarations):
        self.is_union = is_union
        self.attrib = attrib
        self.tag = tag
        self.declarations = declarations
        self.filename = None
        self.lineno = -1

    def __repr__(self):
        if self.is_union:
            s = "union"
        else:
            s = "struct"
        if self.attrib:
            attrs = list()
            for attr, val in self.attrib.items():
                if val and type(val) == str:
                    attrs.append("{}({})".format(attr, val))
                elif val:
                    attrs.append(attr)

            s += " __attribute__(({}))".format(",".join(attrs))
        if self.tag and type(self.tag) != int:
            s += " %s" % self.tag
        if self.declarations:
            s += " {%s}" % "; ".join([repr(d) for d in self.declarations])
        return s


class EnumSpecifier(object):
    def __init__(self, tag, enumerators, src=None):
        self.tag = tag
        self.enumerators = enumerators
        self.filename = None
        self.lineno = -1

    def __repr__(self):
        s = "enum"
        if self.tag:
            s += " %s" % self.tag
        if self.enumerators:
            s += " {%s}" % ", ".join([repr(e) for e in self.enumerators])
        return s


class Enumerator(object):
    def __init__(self, name, expression):
        self.name = name
        self.expression = expression

    def __repr__(self):
        s = self.name
        if self.expression:
            s += " = %r" % self.expression
        return s


class TypeQualifier(str):
    def __repr__(self):
        return "TypeQualifier({})".format(str(self))


class PragmaPack(object):
    DEFAULT = None

    def __init__(self):
        self.current = self.DEFAULT
        self.stack = list()

    def set_default(self):
        self.current = self.DEFAULT

    def push(self, id=None, value=None):
        item = (id, self.current)
        self.stack.append(item)

        if value is not None:
            self.current = value

    def pop(self, id=None):
        if not self.stack:
            if id:
                return (
                    "#pragma pack(pop, {id}) encountered without matching "
                    "#pragma pack(push, {id})".format(id=id)
                )
            else:
                return "#pragma pack(pop) encountered without matching #pragma pack(push)"

        item = None
        err = None

        if id is not None:
            i = len(self.stack) - 1
            while i >= 0 and self.stack[i][0] != id:
                i -= 1

            if i >= 0:
                item = self.stack[i]
                self.stack = self.stack[:i]
            else:
                err = (
                    "#pragma pack(pop, {id}) encountered without matching "
                    "#pragma pack(push, {id}); popped last".format(id=id)
                )

        if item is None:
            item = self.stack.pop()

        self.current = item[1]
        return err


pragma_pack = PragmaPack()


class Attrib(dict):
    def __init__(self, *a, **kw):
        if pragma_pack.current:
            super(Attrib, self).__init__(packed=True, aligned=[pragma_pack.current])
            super(Attrib, self).update(*a, **kw)
        else:
            super(Attrib, self).__init__(*a, **kw)
        self._unalias()

    def __repr__(self):
        return "Attrib({})".format(dict(self))

    def update(self, *a, **kw):
        super(Attrib, self).update(*a, **kw)
        self._unalias()

    def _unalias(self):
        """
        Check for any attribute aliases and remove leading/trailing '__'

        According to https://gcc.gnu.org/onlinedocs/gcc/Attribute-Syntax.html,
        an attribute can also be preceded/followed by a double underscore
        ('__').
        """

        self.pop(None, None)  # remove dummy empty attribute

        fixes = [attr for attr in self if attr.startswith("__") and attr.endswith("__")]
        for attr in fixes:
            self[attr[2 : (len(attr) - 2)]] = self.pop(attr)


def apply_specifiers(specifiers, declaration):
    """Apply specifiers to the declaration (declaration may be
    a Parameter instead)."""
    for s in specifiers:
        if type(s) == StorageClassSpecifier:
            if declaration.storage:
                # Multiple storage classes, technically an error... ignore it
                pass
            declaration.storage = s
        elif type(s) in (TypeSpecifier, StructTypeSpecifier, EnumSpecifier):
            declaration.type.specifiers.append(s)
        elif type(s) == TypeQualifier:
            declaration.type.qualifiers.append(s)
        elif type(s) == Attrib:
            declaration.attrib.update(s)


================================================
FILE: ctypesgen/parser/cgrammar.py
================================================
#!/usr/bin/env python3

"""This is a yacc grammar for C.

Derived from ANSI C grammar:
  * Lexicon: http://www.lysator.liu.se/c/ANSI-C-grammar-l.html
             http://www.quut.com/c/ANSI-C-grammar-l-2011.html
  * Grammar: http://www.lysator.liu.se/c/ANSI-C-grammar-y.html
             http://www.quut.com/c/ANSI-C-grammar-y-2011.html

Reference is C99:
  * http://www.open-std.org/JTC1/SC22/WG14/www/docs/n1124.pdf

Parts of C2X (C23) are included:
  * http://www.open-std.org/jtc1/sc22/wg14/www/docs/n2731.pdf
"""

__docformat__ = "restructuredtext"

if __name__ == "__main__":
    # NOTE if this file is modified, run to generate a new parsetab.py
    #   E.g.:
    #       env PYTHONPATH=. python ctypesgen/parser/cgrammar.py
    # new_parsetab.py is generated in the current directory and needs to be
    # manually copied (after inspection) to ctypesgen/parser/parsetab.py
    import sys
    import os

    sys.path.insert(0, os.path.join(os.path.pardir, os.path.pardir))
    from ctypesgen.parser.cgrammar import main

    main()
    sys.exit()

import os.path
import sys

from ctypesgen import expressions
from ctypesgen.ctypedescs import anonymous_struct_tagnum
from ctypesgen.parser import cdeclarations, yacc


reserved_keyword_tokens = (
    "SIZEOF", "TYPEDEF", "EXTERN", "STATIC", "AUTO", "REGISTER", "INLINE",
    "CONST", "RESTRICT", "VOLATILE",
    "CHAR", "SHORT", "INT", "LONG", "SIGNED", "UNSIGNED", "FLOAT", "DOUBLE",
    "VOID", "STRUCT", "UNION", "ENUM",

    "CASE", "DEFAULT", "IF", "ELSE", "SWITCH", "WHILE", "DO", "FOR", "GOTO",
    "CONTINUE", "BREAK", "RETURN",
)

reserved_keyword_tokens_new = (
    "_BOOL", "_NORETURN",
    # "_ALIGNAS", "_ALIGNOF", "_ATOMIC", "_COMPLEX",
    # "_DECIMAL128", "_DECIMAL32", "_DECIMAL64",
    # "_GENERIC", "_IMAGINARY", "_STATIC_ASSERT", "_THREAD_LOCAL",
)

extra_keywords_with_alias = {
    "__asm__": "__ASM__",
    "__attribute__": "__ATTRIBUTE__",
    "__restrict": "RESTRICT",
    "__inline__": "INLINE",
    "__inline": "INLINE",
}

keyword_map = {}
for keyword in reserved_keyword_tokens:
    keyword_map[keyword.lower()] = keyword
for keyword in reserved_keyword_tokens_new:
    keyword_map[keyword[:2].upper() + keyword[2:].lower()] = keyword
    keyword_map[keyword[1:].lower()] = keyword
keyword_map.update(extra_keywords_with_alias)

keywords = tuple(keyword_map.keys())

tokens = reserved_keyword_tokens + reserved_keyword_tokens_new + (
    # Identifier
    "IDENTIFIER",

    # Type identifiers
    "TYPE_NAME",
    # "FUNC_NAME",  "TYPEDEF_NAME",

    # Constants
    "STRING_LITERAL", "CHARACTER_CONSTANT",
    # "ENUMERATION_CONSTANT",
    "I_CONST_HEX", "I_CONST_DEC", "I_CONST_OCT", "I_CONST_BIN",
    "F_CONST_1", "F_CONST_2", "F_CONST_3", "F_CONST_4", "F_CONST_5", "F_CONST_6",

    # Operators
    "PLUS", "MINUS", "TIMES", "DIVIDE", "MOD", "AND",
    "OR", "NOT", "XOR", "LNOT", "LT", "GT", "CONDOP",
    "PTR_OP", "INC_OP", "DEC_OP", "LEFT_OP", "RIGHT_OP",
    "LE_OP", "GE_OP", "EQ_OP", "NE_OP", "AND_OP", "OR_OP",

    # Assignment
    "MUL_ASSIGN", "DIV_ASSIGN", "MOD_ASSIGN", "ADD_ASSIGN",
    "SUB_ASSIGN", "LEFT_ASSIGN", "RIGHT_ASSIGN", "AND_ASSIGN",
    "XOR_ASSIGN", "OR_ASSIGN", "EQUALS",

    # Preprocessor
    "PP_DEFINE", "PP_DEFINE_MACRO_NAME", "PP_DEFINE_NAME", "PP_END_DEFINE",
    "PP_IDENTIFIER_PASTE", "PP_MACRO_PARAM", "PP_STRINGIFY", "PP_UNDEFINE",
    # "PP_NUMBER",

    # Pragma
    "PRAGMA", "PRAGMA_END", "PRAGMA_PACK",

    # Delimiters
    "PERIOD", "ELLIPSIS", "LPAREN", "RPAREN", "LBRACKET",
    "RBRACKET", "LBRACE", "RBRACE", "COMMA", "SEMI",
    "COLON",

    "__ASM__", "__ATTRIBUTE__",
)


precedence = (("nonassoc", "IF"), ("nonassoc", "ELSE"))


def p_translation_unit(p):
    """ translation_unit :
                         | translation_unit external_declaration
                         | translation_unit directive
    """
    # Starting production.
    # Allow empty production so that files with no declarations are still
    #    valid.
    # Intentionally empty


def p_identifier(p):
    """ identifier : IDENTIFIER
                   | IDENTIFIER PP_IDENTIFIER_PASTE identifier
                   | PP_MACRO_PARAM PP_IDENTIFIER_PASTE identifier
                   | IDENTIFIER PP_IDENTIFIER_PASTE PP_MACRO_PARAM
                   | PP_MACRO_PARAM PP_IDENTIFIER_PASTE PP_MACRO_PARAM
    """
    if len(p) == 2:
        p[0] = expressions.IdentifierExpressionNode(p[1])
    else:
        # Should it be supported? It wouldn't be very hard to add support.
        # Basically, it would involve a new ExpressionNode called
        # an IdentifierPasteExpressionNode that took a list of strings and
        # ParameterExpressionNodes. Then it would generate code like
        # "locals()['%s' + '%s' + ...]" where %s was substituted with the
        # elements of the list. I haven't supported it yet because I think
        # it's unnecessary and a little too powerful.
        p[0] = expressions.UnsupportedExpressionNode(
            "Identifier pasting is not supported by ctypesgen."
        )


def p_constant_integer(p):
    """ constant : I_CONST_HEX
                 | I_CONST_DEC
                 | I_CONST_OCT
                 | I_CONST_BIN
    """
    constant = p[1]
    is_literal = True

    if constant.isdigit():
        is_literal = False
        constant = int(p[1])

    p[0] = expressions.ConstantExpressionNode(constant, is_literal=is_literal)


def p_constant_float(p):
    """ constant : F_CONST_1
                 | F_CONST_2
                 | F_CONST_3
                 | F_CONST_4
                 | F_CONST_5
                 | F_CONST_6
    """
    p[0] = expressions.ConstantExpressionNode(p[1], is_literal=True)


def p_constant_character(p):
    """ constant : CHARACTER_CONSTANT
    """
    constant_char = p[1]

    p[0] = expressions.ConstantExpressionNode(constant_char)


def p_string_literal(p):
    """ string_literal : STRING_LITERAL
    """
    p[0] = expressions.ConstantExpressionNode(p[1])


def p_multi_string_literal(p):
    """ multi_string_literal : string_literal
                             | macro_param
                             | multi_string_literal string_literal
                             | multi_string_literal macro_param
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = expressions.BinaryExpressionNode(
            "string concatenation", (lambda x, y: x + y), "(%s + %s)", (False, False), p[1], p[2]
        )


def p_macro_param(p):
    """ macro_param : PP_MACRO_PARAM
                    | PP_STRINGIFY PP_MACRO_PARAM
    """
    if len(p) == 2:
        p[0] = expressions.ParameterExpressionNode(p[1])
    else:
        p[0] = expressions.ParameterExpressionNode(p[2])


def p_primary_expression(p):
    """ primary_expression : identifier
                           | constant
                           | multi_string_literal
                           | LPAREN expression RPAREN
    """
    if p[1] == "(":
        p[0] = p[2]
    else:
        p[0] = p[1]


def p_postfix_expression(p):
    """ postfix_expression : primary_expression
                           | postfix_expression LBRACKET expression RBRACKET
                           | postfix_expression LPAREN RPAREN
                           | postfix_expression LPAREN argument_expression_list RPAREN
                           | postfix_expression PERIOD IDENTIFIER
                           | postfix_expression PTR_OP IDENTIFIER
                           | postfix_expression INC_OP
                           | postfix_expression DEC_OP
    """

    if len(p) == 2:
        p[0] = p[1]

    elif p[2] == "[":
        p[0] = expressions.BinaryExpressionNode(
            "array access", (lambda a, b: a[b]), "(%s [%s])", (True, False), p[1], p[3]
        )

    elif p[2] == "(":
        if p[3] == ")":
            p[0] = expressions.CallExpressionNode(p[1], [])
        else:
            p[0] = expressions.CallExpressionNode(p[1], p[3])

    elif p[2] == ".":
        p[0] = expressions.AttributeExpressionNode(
            (lambda x, a: getattr(x, a)), "(%s.%s)", p[1], p[3]
        )

    elif p[2] == "->":
        p[0] = expressions.AttributeExpressionNode(
            (lambda x, a: getattr(x.contents, a)), "(%s.contents.%s)", p[1], p[3]
        )

    elif p[2] == "++":
        p[0] = expressions.UnaryExpressionNode(
            "increment", (lambda x: x + 1), "(%s + 1)", False, p[1]
        )

    elif p[2] == "--":
        p[0] = expressions.UnaryExpressionNode(
            "decrement", (lambda x: x - 1), "(%s - 1)", False, p[1]
        )


def p_argument_expression_list(p):
    """ argument_expression_list : assignment_expression
                                 | argument_expression_list COMMA assignment_expression
                                 | type_name
                                 | argument_expression_list COMMA type_name
    """
    if len(p) == 4:
        p[1].append(p[3])
        p[0] = p[1]
    else:
        p[0] = [p[1]]


def p_asm_expression(p):
    """ asm_expression : __ASM__ volatile_opt LPAREN string_literal RPAREN
                       | __ASM__ volatile_opt LPAREN string_literal COLON str_opt_expr_pair_list RPAREN
                       | __ASM__ volatile_opt LPAREN string_literal COLON str_opt_expr_pair_list COLON str_opt_expr_pair_list RPAREN
                       | __ASM__ volatile_opt LPAREN string_literal COLON str_opt_expr_pair_list COLON str_opt_expr_pair_list COLON str_opt_expr_pair_list RPAREN
    """

    # Definitely not ISO C, adapted from example ANTLR GCC parser at
    #  http://www.antlr.org/grammar/cgram//grammars/GnuCParser.g
    # but more lenient (expressions permitted in optional final part, when
    # they shouldn't be -- avoids shift/reduce conflict with
    # str_opt_expr_pair_list).

    p[0] = expressions.UnsupportedExpressionNode("This node is ASM assembler.")


def p_str_opt_expr_pair_list(p):
    """ str_opt_expr_pair_list :
                               | str_opt_expr_pair
                               | str_opt_expr_pair_list COMMA str_opt_expr_pair
    """


def p_str_opt_expr_pair(p):
    """ str_opt_expr_pair : string_literal
                          | string_literal LPAREN expression RPAREN
    """


def p_volatile_opt(p):
    """ volatile_opt :
                     | VOLATILE
    """


prefix_ops_dict = {
    "++": ("increment", (lambda x: x + 1), "(%s + 1)", False),
    "--": ("decrement", (lambda x: x - 1), "(%s - 1)", False),
    "&": ("reference ('&')", None, "pointer(%s)", True),
    "*": ("dereference ('*')", None, "(%s[0])", True),
    "+": ("unary '+'", (lambda x: x), "%s", True),
    "-": ("negation", (lambda x: -x), "(-%s)", False),
    "~": ("inversion", (lambda x: ~x), "(~%s)", False),
    "!": ("logical not", (lambda x: not x), "(not %s)", True),
}
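
Each `prefix_ops_dict` entry packs a display name, a Python evaluator, a %-format string for rendering the generated source, and a `can_be_ctype` flag. A minimal standalone sketch of how such an entry is applied (the `ops` dict and `apply_prefix` helper are hypothetical illustrations, not part of ctypesgen):

```python
# Illustrative only: mirrors the (name, evaluator, format, can_be_ctype)
# tuples in prefix_ops_dict above.
ops = {
    "-": ("negation", (lambda x: -x), "(-%s)", False),
    "!": ("logical not", (lambda x: not x), "(not %s)", True),
}

def apply_prefix(op, value, source):
    # Evaluate the operand numerically and render its Python source form.
    name, fn, fmt, can_be_ctype = ops[op]
    return fn(value), fmt % source

# apply_prefix("-", 5, "x") evaluates to -5 and renders "(-x)"
```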


def p_unary_expression(p):
    """ unary_expression : postfix_expression
                         | INC_OP unary_expression
                         | DEC_OP unary_expression
                         | unary_operator cast_expression
                         | SIZEOF unary_expression
                         | SIZEOF LPAREN type_name RPAREN
                         | asm_expression
    """
    if len(p) == 2:
        p[0] = p[1]

    elif p[1] == "sizeof":
        if len(p) == 5:
            p[0] = expressions.SizeOfExpressionNode(p[3])
        else:
            p[0] = expressions.SizeOfExpressionNode(p[2])

    else:
        name, op, format, can_be_ctype = prefix_ops_dict[p[1]]
        p[0] = expressions.UnaryExpressionNode(name, op, format, can_be_ctype, p[2])


def p_unary_operator(p):
    """ unary_operator : AND
                       | TIMES
                       | PLUS
                       | MINUS
                       | NOT
                       | LNOT
    """
    p[0] = p[1]


def p_cast_expression(p):
    """ cast_expression : unary_expression
                        | LPAREN type_name RPAREN cast_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = expressions.TypeCastExpressionNode(p[4], p[2])


mult_ops_dict = {
    "*": ("multiplication", (lambda x, y: x * y), "(%s * %s)"),
    "/": ("division", (lambda x, y: x / y), "(%s / %s)"),
    "%": ("modulo", (lambda x, y: x % y), "(%s %% %s)"),
}
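
Note the doubled percent in the modulo entry: Python's %-formatting consumes `%%` and emits a single literal `%`, which is why the format string differs from the other operators. A quick standalone check:

```python
# "%%" in a %-format string renders as a single literal "%".
fmt = "(%s %% %s)"
rendered = fmt % ("a", "b")
assert rendered == "(a % b)"
```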


def p_multiplicative_expression(p):
    """ multiplicative_expression : cast_expression
                                  | multiplicative_expression TIMES cast_expression
                                  | multiplicative_expression DIVIDE cast_expression
                                  | multiplicative_expression MOD cast_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        name, op, format = mult_ops_dict[p[2]]
        p[0] = expressions.BinaryExpressionNode(name, op, format, (False, False), p[1], p[3])


add_ops_dict = {
    "+": ("addition", (lambda x, y: x + y), "(%s + %s)"),
    "-": ("subtraction", (lambda x, y: x - y), "(%s - %s)"),
}


def p_additive_expression(p):
    """ additive_expression : multiplicative_expression
                            | additive_expression PLUS multiplicative_expression
                            | additive_expression MINUS multiplicative_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        name, op, format = add_ops_dict[p[2]]
        p[0] = expressions.BinaryExpressionNode(name, op, format, (False, False), p[1], p[3])


shift_ops_dict = {
    ">>": ("right shift", (lambda x, y: x >> y), "(%s >> %s)"),
    "<<": ("left shift", (lambda x, y: x << y), "(%s << %s)"),
}


def p_shift_expression(p):
    """ shift_expression : additive_expression
                         | shift_expression LEFT_OP additive_expression
                         | shift_expression RIGHT_OP additive_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        name, op, format = shift_ops_dict[p[2]]
        p[0] = expressions.BinaryExpressionNode(name, op, format, (False, False), p[1], p[3])


rel_ops_dict = {
    ">": ("greater-than", (lambda x, y: x > y), "(%s > %s)"),
    "<": ("less-than", (lambda x, y: x < y), "(%s < %s)"),
    ">=": ("greater-than-equal", (lambda x, y: x >= y), "(%s >= %s)"),
    "<=": ("less-than-equal", (lambda x, y: x <= y), "(%s <= %s)"),
}


def p_relational_expression(p):
    """ relational_expression : shift_expression
                              | relational_expression LT shift_expression
                              | relational_expression GT shift_expression
                              | relational_expression LE_OP shift_expression
                              | relational_expression GE_OP shift_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        name, op, format = rel_ops_dict[p[2]]
        p[0] = expressions.BinaryExpressionNode(name, op, format, (False, False), p[1], p[3])


equality_ops_dict = {
    "==": ("equals", (lambda x, y: x == y), "(%s == %s)"),
    "!=": ("not equals", (lambda x, y: x != y), "(%s != %s)"),
}


def p_equality_expression(p):
    """ equality_expression : relational_expression
                            | equality_expression EQ_OP relational_expression
                            | equality_expression NE_OP relational_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        name, op, format = equality_ops_dict[p[2]]
        p[0] = expressions.BinaryExpressionNode(name, op, format, (False, False), p[1], p[3])


def p_and_expression(p):
    """ and_expression : equality_expression
                       | and_expression AND equality_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = expressions.BinaryExpressionNode(
            "bitwise and", (lambda x, y: x & y), "(%s & %s)", (False, False), p[1], p[3]
        )


def p_exclusive_or_expression(p):
    """ exclusive_or_expression : and_expression
                                | exclusive_or_expression XOR and_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = expressions.BinaryExpressionNode(
            "bitwise xor", (lambda x, y: x ^ y), "(%s ^ %s)", (False, False), p[1], p[3]
        )


def p_inclusive_or_expression(p):
    """ inclusive_or_expression : exclusive_or_expression
                                | inclusive_or_expression OR exclusive_or_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = expressions.BinaryExpressionNode(
            "bitwise or", (lambda x, y: x | y), "(%s | %s)", (False, False), p[1], p[3]
        )


def p_logical_and_expression(p):
    """ logical_and_expression : inclusive_or_expression
                               | logical_and_expression AND_OP inclusive_or_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = expressions.BinaryExpressionNode(
            "logical and", (lambda x, y: x and y), "(%s and %s)", (True, True), p[1], p[3]
        )


def p_logical_or_expression(p):
    """ logical_or_expression : logical_and_expression
                              | logical_or_expression OR_OP logical_and_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = expressions.BinaryExpressionNode(
            "logical or", (lambda x, y: x or y), "(%s or %s)", (True, True), p[1], p[3]
        )


def p_conditional_expression(p):
    """ conditional_expression : logical_or_expression
                               | logical_or_expression CONDOP expression COLON conditional_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = expressions.ConditionalExpressionNode(p[1], p[3], p[5])


assign_ops_dict = {
    "*=": ("multiply", (lambda x, y: x * y), "(%s * %s)"),
    "/=": ("divide", (lambda x, y: x / y), "(%s / %s)"),
    "%=": ("modulus", (lambda x, y: x % y), "(%s %% %s)"),
    "+=": ("addition", (lambda x, y: x + y), "(%s + %s)"),
    "-=": ("subtraction", (lambda x, y: x - y), "(%s - %s)"),
    "<<=": ("left shift", (lambda x, y: x << y), "(%s << %s)"),
    ">>=": ("right shift", (lambda x, y: x >> y), "(%s >> %s)"),
    "&=": ("bitwise and", (lambda x, y: x & y), "(%s & %s)"),
    "^=": ("bitwise xor", (lambda x, y: x ^ y), "(%s ^ %s)"),
    "|=": ("bitwise or", (lambda x, y: x | y), "(%s | %s)"),
}


def p_assignment_expression(p):
    """ assignment_expression : conditional_expression
                              | unary_expression assignment_operator assignment_expression
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        # In C, the value of (x*=3) is the same as (x*3). We support that here.
        # However, we don't support the change in the value of x.
        if p[2] == "=":
            p[0] = p[3]
        else:
            name, op, format = assign_ops_dict[p[2]]
            p[0] = expressions.BinaryExpressionNode(name, op, format, (True, True), p[1], p[3])


def p_assignment_operator(p):
    """ assignment_operator : EQUALS
                            | MUL_ASSIGN
                            | DIV_ASSIGN
                            | MOD_ASSIGN
                            | ADD_ASSIGN
                            | SUB_ASSIGN
                            | LEFT_ASSIGN
                            | RIGHT_ASSIGN
                            | AND_ASSIGN
                            | XOR_ASSIGN
                            | OR_ASSIGN
    """
    p[0] = p[1]


def p_expression(p):
    """ expression : assignment_expression
                   | expression COMMA assignment_expression
    """
    p[0] = p[1]
    # We don't need to support sequence expressions, so just keep the first
    # operand (a C comma expression actually evaluates to its last operand).


def p_constant_expression(p):
    """ constant_expression : conditional_expression
    """
    p[0] = p[1]


def p_declaration(p):
    """ declaration : declaration_impl SEMI
    """
    # The ';' must be here, not in 'declaration_impl', as the declaration
    # action needs to run before the ';' is shifted (otherwise the next
    # lookahead will be read, which may be affected by this declaration if
    # it's a typedef).


def p_declaration_impl(p):
    """ declaration_impl : declaration_specifier_list
                         | declaration_specifier_list init_declarator_list
    """
    declaration = cdeclarations.Declaration()
    cdeclarations.apply_specifiers(p[1], declaration)

    if len(p) == 2:
        filename = p.slice[1].filename
        lineno = p.slice[1].lineno
        p.parser.cparser.impl_handle_declaration(declaration, filename, lineno)
        return

    filename = p.slice[2].filename
    lineno = p.slice[2].lineno
    for declarator in p[2]:
        declaration.declarator = declarator
        p.parser.cparser.impl_handle_declaration(declaration, filename, lineno)


def p_declaration_specifier_list(p):
    """ declaration_specifier_list : gcc_attributes declaration_specifier gcc_attributes
                                   | declaration_specifier_list declaration_specifier gcc_attributes
    """
    if type(p[1]) == cdeclarations.Attrib:
        p[0] = (p[1], p[2], p[3])
        p.slice[0].filename = p.slice[2].filename
        p.slice[0].lineno = p.slice[2].lineno
    else:
        p[0] = p[1] + (p[2], p[3])
        p.slice[0].filename = p.slice[1].filename
        p.slice[0].lineno = p.slice[1].lineno


def p_declaration_specifier(p):
    """ declaration_specifier : storage_class_specifier
                              | type_specifier
                              | type_qualifier
                              | function_specifier
    """
    p[0] = p[1]


def p_init_declarator_list(p):
    """ init_declarator_list : init_declarator
                             | init_declarator_list COMMA init_declarator
    """
    if len(p) > 2:
        p[0] = p[1] + (p[3],)
    else:
        p[0] = (p[1],)


def p_init_declarator(p):
    """ init_declarator : declarator gcc_attributes
                        | declarator gcc_attributes EQUALS initializer
    """
    p[0] = p[1]
    p[0].attrib.update(p[2])
    p.slice[0].filename = p.slice[1].filename
    p.slice[0].lineno = p.slice[1].lineno
    if len(p) > 3:
        p[0].initializer = p[4]


def p_storage_class_specifier(p):
    """ storage_class_specifier : TYPEDEF
                                | EXTERN
                                | STATIC
                                | AUTO
                                | REGISTER
    """
    p[0] = cdeclarations.StorageClassSpecifier(p[1])


def p_type_specifier(p):
    """ type_specifier : VOID
                       | _BOOL
                       | CHAR
                       | SHORT
                       | INT
                       | LONG
                       | FLOAT
                       | DOUBLE
                       | SIGNED
                       | UNSIGNED
                       | struct_or_union_specifier
                       | enum_specifier
                       | TYPE_NAME
    """
    if type(p[1]) in (cdeclarations.StructTypeSpecifier, cdeclarations.EnumSpecifier):
        p[0] = p[1]
    else:
        p[0] = cdeclarations.TypeSpecifier(p[1])


def p_struct_or_union_specifier(p):
    """ struct_or_union_specifier : struct_or_union gcc_attributes IDENTIFIER LBRACE member_declaration_list RBRACE
                                  | struct_or_union gcc_attributes TYPE_NAME LBRACE member_declaration_list RBRACE
                                  | struct_or_union gcc_attributes LBRACE member_declaration_list RBRACE
                                  | struct_or_union gcc_attributes IDENTIFIER
                                  | struct_or_union gcc_attributes TYPE_NAME
    """
    # Format of the gcc_attributes grammar taken from c-parser.c in the GCC
    # source. The TYPE_NAME alternatives are dodgy but needed for Apple headers
    # CoreServices.framework/Frameworks/CarbonCore.framework/Headers/Files.h and
    # CoreServices.framework/Frameworks/OSServices.framework/Headers/Power.h.
    tag = None
    decl = None

    if len(p) == 4:  # struct [attributes] <id/typname>
        tag = p[3]
    elif p[3] == "{":
        tag, decl = anonymous_struct_tagnum(), p[4]
    else:
        tag, decl = p[3], p[5]

    p[0] = cdeclarations.StructTypeSpecifier(p[1], p[2], tag, decl)

    p.slice[0].filename = p.slice[1].filename
    p.slice[0].lineno = p.slice[1].lineno
    p[0].filename = p.slice[1].filename
    p[0].lineno = p.slice[1].lineno


def p_struct_or_union(p):
    """ struct_or_union : STRUCT
                        | UNION
    """
    p[0] = p[1] == "union"


def p_gcc_attributes(p):
    """ gcc_attributes :
                       | gcc_attributes gcc_attribute
    """
    # Allow an empty production on attributes (taken from c-parser.c in the GCC source)
    if len(p) == 1:
        p[0] = cdeclarations.Attrib()
    else:
        p[0] = p[1]
        p[0].update(p[2])


def p_gcc_attribute(p):
    """ gcc_attribute : __ATTRIBUTE__ LPAREN LPAREN gcc_attrib_list RPAREN RPAREN
    """
    p[0] = cdeclarations.Attrib()
    p[0].update(p[4])


def p_gcc_attrib_list(p):
    """ gcc_attrib_list : gcc_attrib
                        | gcc_attrib_list COMMA gcc_attrib
    """
    if len(p) == 2:
        p[0] = (p[1],)
    else:
        p[0] = p[1] + (p[3],)


def p_gcc_attrib(p):
    """ gcc_attrib :
                   | IDENTIFIER
                   | IDENTIFIER LPAREN argument_expression_list RPAREN
    """
    if len(p) == 1:
        p[0] = (None, None)
    elif len(p) == 2:
        p[0] = (p[1], True)
    elif len(p) == 5:
        p[0] = (p[1], p[3])
    else:
        raise RuntimeError("Should never reach this part of the grammar")


def p_member_declaration_list(p):
    """ member_declaration_list : member_declaration
                                | member_declaration_list member_declaration
    """
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = p[1] + p[2]


def p_member_declaration(p):
    """ member_declaration : specifier_qualifier_list member_declarator_list SEMI
                           | specifier_qualifier_list SEMI
    """
    # p[0] returned is a tuple, to handle multiple declarators in one
    # declaration.
    r = ()
    if len(p) >= 4:
        for declarator in p[2]:
            declaration = cdeclarations.Declaration()
            cdeclarations.apply_specifiers(p[1], declaration)
            declaration.declarator = declarator
            r += (declaration,)
    else:
        # anonymous field (C11/GCC extension)
        declaration = cdeclarations.Declaration()
        cdeclarations.apply_specifiers(p[1], declaration)
        r = (declaration,)

    p[0] = r


def p_specifier_qualifier_list(p):
    """ specifier_qualifier_list : gcc_attributes specifier_qualifier gcc_attributes
                                 | specifier_qualifier_list specifier_qualifier gcc_attributes
    """
    if type(p[1]) == cdeclarations.Attrib:
        p[0] = (p[1], p[2], p[3])
    else:
        p[0] = p[1] + (p[2], p[3])


def p_specifier_qualifier(p):
    """ specifier_qualifier : type_specifier
                            | type_qualifier
    """
    p[0] = p[1]


def p_member_declarator_list(p):
    """ member_declarator_list : member_declarator
                               | member_declarator_list COMMA member_declarator
    """
    if len(p) == 2:
        p[0] = (p[1],)
    else:
        p[0] = p[1] + (p[3],)


def p_member_declarator(p):
    """ member_declarator : declarator gcc_attributes
                          | COLON constant_expression gcc_attributes
                          | declarator COLON constant_expression gcc_attributes
    """
    if p[1] == ":":
        p[0] = cdeclarations.Declarator()
        p[0].bitfield = p[2]
    else:
        p[0] = p[1]
        # Bitfield support
        if p[2] == ":":
            p[0].bitfield = p[3]

    p[0].attrib.update(p[len(p) - 1])


def p_enum_specifier(p):
    """ enum_specifier : ENUM LBRACE enumerator_list RBRACE
                       | ENUM IDENTIFIER LBRACE enumerator_list RBRACE
                       | ENUM IDENTIFIER
    """
    if len(p) == 5:
        p[0] = cdeclarations.EnumSpecifier(None, p[3])
    elif len(p) == 6:
        p[0] = cdeclarations.EnumSpecifier(p[2], p[4])
    else:
        p[0] = cdeclarations.EnumSpecifier(p[2], ())

    p[0].filename = p.slice[0].filename
    p[0].lineno = p.slice[0].lineno


def p_enumerator_list(p):
    """ enumerator_list : enumerator_list_iso
                        | enumerator_list_iso COMMA
    """
    # Apple headers sometimes have a trailing ',' after the last enumerator,
    # which C89 forbids (C99 and later allow it).
    p[0] = p[1]


def p_enumerator_list_iso(p):
    """ enumerator_list_iso : enumerator
                            | enumerator_list_iso COMMA enumerator
    """
    if len(p) == 2:
        p[0] = (p[1],)
    else:
        p[0] = p[1] + (p[3],)


def p_enumerator(p):
    """ enumerator : IDENTIFIER
                   | IDENTIFIER EQUALS constant_expression
    """
    if len(p) == 2:
        p[0] = cdeclarations.Enumerator(p[1], None)
    else:
        p[0] = cdeclarations.Enumerator(p[1], p[3])


def p_type_qualifier(p):
    """ type_qualifier : CONST
                       | VOLATILE
                       | RESTRICT
    """
    p[0] = cdeclarations.TypeQualifier(p[1])


def p_function_specifier(p):
    """ function_specifier : INLINE
                           | _NORETURN
    """


def p_declarator(p):
    """ declarator : pointer direct_declarator
                   | direct_declarator
    """
    if len(p) > 2:
        p[0] = p[1]
        ptr = p[1]
        while ptr.pointer:
            ptr = ptr.pointer
        ptr.pointer = p[2]
        p[2].attrib.update(p[1].attrib)
    else:
        p[0] = p[1]


def p_direct_declarator(p):
    """ direct_declarator : IDENTIFIER
                          | LPAREN gcc_attributes declarator RPAREN
                          | direct_declarator LBRACKET constant_expression RBRACKET
                          | direct_declarator LBRACKET RBRACKET
                          | direct_declarator LPAREN parameter_type_list RPAREN
                          | direct_declarator LPAREN identifier_list RPAREN
                          | direct_declarator LPAREN RPAREN
    """
    if isinstance(p[1], cdeclarations.Declarator):
        p[0] = p[1]
        if p[2] == "[":
            a = cdeclarations.Array()
            a.array = p[0].array
            p[0].array = a
            if p[3] != "]":
                a.size = p[3]
        else:
            if p[3] == ")":
                p[0].parameters = ()
            else:
                p[0].parameters = p[3]
    elif p[1] == "(":
        p[0] = p[3]
        p[3].attrib.update(p[2])
    else:
        p[0] = cdeclarations.Declarator()
        p[0].identifier = p[1]

    # Check parameters for (void) and simplify to empty tuple.
    if p[0].parameters and len(p[0].parameters) == 1:
        param = p[0].parameters[0]
        if param.type.specifiers == ["void"] and not param.declarator:
            p[0].parameters = ()
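
The `(void)` simplification above can be sketched standalone; the dicts here are hypothetical stand-ins for the real `Parameter`/`Declaration` objects:

```python
# Hypothetical sketch of the (void) check: a single parameter whose type
# specifiers are exactly ["void"] and which has no declarator means the
# function takes no arguments, so the list collapses to an empty tuple.
params = [{"specifiers": ["void"], "declarator": None}]
if len(params) == 1 and params[0]["specifiers"] == ["void"] and not params[0]["declarator"]:
    params = []
# int f(void) -> no parameters
```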


def p_pointer(p):
    """ pointer : TIMES
                | TIMES type_qualifier_list
                | TIMES pointer
                | TIMES type_qualifier_list pointer
    """
    if len(p) == 2:
        p[0] = cdeclarations.Pointer()
    elif len(p) == 3 and isinstance(p[2], cdeclarations.Pointer):
        p[0] = cdeclarations.Pointer()
        p[0].pointer = p[2]
        p[0].attrib.update(p[2].attrib)
    else:
        p[0] = cdeclarations.Pointer()
        for tq in p[2]:
            if isinstance(tq, cdeclarations.Attrib):
                p[0].attrib.update(tq)
            else:
                p[0].qualifiers += (tq,)

        if len(p) == 4:
            p[0].pointer = p[3]
            p[0].attrib.update(p[3].attrib)


def p_type_qualifier_list(p):
    """ type_qualifier_list : type_qualifier
                            | gcc_attribute
                            | type_qualifier_list type_qualifier
                            | type_qualifier_list gcc_attribute
    """
    if len(p) > 2:
        p[0] = p[1] + (p[2],)
    else:
        p[0] = (p[1],)


def p_parameter_type_list(p):
    """ parameter_type_list : parameter_list
                            | parameter_list COMMA ELLIPSIS
    """
    if len(p) > 2:
        p[0] = p[1] + (p[3],)
    else:
        p[0] = p[1]


def p_parameter_list(p):
    """ parameter_list : parameter_declaration
                       | parameter_list COMMA parameter_declaration
    """
    if len(p) > 2:
        p[0] = p[1] + (p[3],)
    else:
        p[0] = (p[1],)


def p_parameter_declaration(p):
    """ parameter_declaration : declaration_specifier_list declarator gcc_attributes
                              | declaration_specifier_list abstract_declarator
                              | declaration_specifier_list
    """
    p[0] = cdeclarations.Parameter()
    specs = p[1]

    if len(p) == 4:
        # add the attributes as a final specifier
        specs += (p[3],)
        p[0].declarator = p[2]
    elif len(p) == 3:
        p[0].declarator = p[2]

    cdeclarations.apply_specifiers(specs, p[0])


def p_identifier_list(p):
    """ identifier_list : IDENTIFIER
                        | identifier_list COMMA IDENTIFIER
    """
    param = cdeclarations.Parameter()
    param.declarator = cdeclarations.Declarator()
    if len(p) > 2:
        param.declarator.identifier = p[3]
        p[0] = p[1] + (param,)
    else:
        param.declarator.identifier = p[1]
        p[0] = (param,)


def p_type_name(p):
    """ type_name : specifier_qualifier_list
                  | specifier_qualifier_list abstract_declarator
    """
    typ = p[1]
    if len(p) == 3:
        declarator = p[2]
    else:
        declarator = None

    declaration = cdeclarations.Declaration()
    declaration.declarator = declarator
    cdeclarations.apply_specifiers(typ, declaration)
    ctype = p.parser.cparser.get_ctypes_type(declaration.type, declaration.declarator)
    p[0] = ctype


def p_abstract_declarator(p):
    """ abstract_declarator : pointer
                            | direct_abstract_declarator         gcc_attributes
                            | pointer direct_abstract_declarator gcc_attributes
    """
    if len(p) == 2:
        p[0] = p[1]
        ptr = p[0]
        while ptr.pointer:
            ptr = ptr.pointer
        # Only if it doesn't already terminate in a declarator
        if type(ptr) == cdeclarations.Pointer:
            ptr.pointer = cdeclarations.Declarator()
            ptr.pointer.attrib.update(p[1].attrib)
        else:
            ptr.attrib.update(p[1].attrib)
    elif len(p) == 3:
        p[0] = p[1]
        p[1].attrib.update(p[2])
    else:
        p[0] = p[1]
        ptr = p[0]
        while ptr.pointer:
            ptr = ptr.pointer
        ptr.pointer = p[2]
        p[2].attrib.update(p[1].attrib)
        p[2].attrib.update(p[3])


def p_direct_abstract_declarator(p):
    """ direct_abstract_declarator : LPAREN gcc_attributes abstract_declarator RPAREN
                                   | LBRACKET RBRACKET
                                   | LBRACKET constant_expression RBRACKET
                                   | direct_abstract_declarator LBRACKET RBRACKET
                                   | direct_abstract_declarator LBRACKET constant_expression RBRACKET
                                   | LPAREN RPAREN
                                   | LPAREN parameter_type_list RPAREN
                                   | direct_abstract_declarator LPAREN RPAREN
                                   | direct_abstract_declarator LPAREN parameter_type_list RPAREN
    """
    if p[1] == "(" and isinstance(p[3], cdeclarations.Declarator):
        p[0] = p[3]
        p[3].attrib.update(p[2])
    else:
        if isinstance(p[1], cdeclarations.Declarator):
            p[0] = p[1]
            if p[2] == "[":
                a = cdeclarations.Array()
                a.array = p[0].array
                p[0].array = a
                if p[3] != "]":
                    p[0].array.size = p[3]
            elif p[2] == "(":
                if p[3] == ")":
                    p[0].parameters = ()
                else:
                    p[0].parameters = p[3]
        else:
            p[0] = cdeclarations.Declarator()
            if p[1] == "[":
                p[0].array = cdeclarations.Array()
                if p[2] != "]":
                    p[0].array.size = p[2]
            elif p[1] == "(":
                if p[2] == ")":
                    p[0].parameters = ()
                else:
                    p[0].parameters = p[2]

    # Check parameters for (void) and simplify to empty tuple.
    if p[0].parameters and len(p[0].parameters) == 1:
        param = p[0].parameters[0]
        if param.type.specifiers == ["void"] and not param.declarator:
            p[0].parameters = ()


def p_initializer(p):
    """ initializer : assignment_expression
                    | LBRACE initializer_list RBRACE
                    | LBRACE initializer_list COMMA RBRACE
    """


def p_initializer_list(p):
    """ initializer_list : initializer
                         | initializer_list COMMA initializer
    """


def p_statement(p):
    """ statement : labeled_statement
                  | compound_statement
                  | expression_statement
                  | selection_statement
                  | iteration_statement
                  | jump_statement
    """


def p_labeled_statement(p):
    """ labeled_statement : IDENTIFIER COLON statement
                          | CASE constant_expression COLON statement
                          | DEFAULT COLON statement
    """


def p_compound_statement(p):
    """ compound_statement : LBRACE RBRACE
                           | LBRACE statement_list RBRACE
                           | LBRACE declaration_list RBRACE
                           | LBRACE declaration_list statement_list RBRACE
    """


def p_compound_statement_error(p):
    """ compound_statement : LBRACE error RBRACE
    """
    # Error resynchronisation catch-all


def p_declaration_list(p):
    """ declaration_list : declaration
                         | declaration_list declaration
    """


def p_statement_list(p):
    """ statement_list : statement
                       | statement_list statement
    """


def p_expression_statement(p):
    """ expression_statement : SEMI
                             | expression SEMI
    """


def p_expression_statement_error(p):
    """ expression_statement : error SEMI
    """
    # Error resynchronisation catch-all


def p_selection_statement(p):
    """ selection_statement : IF LPAREN expression RPAREN statement %prec IF
                            | IF LPAREN expression RPAREN statement ELSE statement
                            | SWITCH LPAREN expression RPAREN statement
    """


def p_iteration_statement(p):
    """ iteration_statement : WHILE LPAREN expression RPAREN statement
                            | DO statement WHILE LPAREN expression RPAREN SEMI
                            | FOR LPAREN expression_statement expression_statement RPAREN statement
                            | FOR LPAREN expression_statement expression_statement expression RPAREN statement
    """


def p_jump_statement(p):
    """ jump_statement : GOTO IDENTIFIER SEMI
                       | CONTINUE SEMI
                       | BREAK SEMI
                       | RETURN SEMI
                       | RETURN expression SEMI
    """


def p_external_declaration(p):
    """ external_declaration : declaration
                             | function_definition
    """
    # Intentionally empty


def p_function_definition(p):
    """ function_definition : declaration_specifier_list declarator declaration_list compound_statement
                            | declaration_specifier_list declarator compound_statement
                            | declarator declaration_list compound_statement
                            | declarator compound_statement
    """
    # No impl of function defs


def p_directive(p):
    """ directive : define
                  | undefine
                  | pragma
    """


def p_define(p):
    """ define : PP_DEFINE PP_DEFINE_NAME PP_END_DEFINE
               | PP_DEFINE PP_DEFINE_NAME type_name PP_END_DEFINE
               | PP_DEFINE PP_DEFINE_NAME constant_expression PP_END_DEFINE
               | PP_DEFINE PP_DEFINE_MACRO_NAME LPAREN RPAREN PP_END_DEFINE
               | PP_DEFINE PP_DEFINE_MACRO_NAME LPAREN RPAREN constant_expression PP_END_DEFINE
               | PP_DEFINE PP_DEFINE_MACRO_NAME LPAREN macro_parameter_list RPAREN PP_END_DEFINE
               | PP_DEFINE PP_DEFINE_MACRO_NAME LPAREN macro_parameter_list RPAREN constant_expression PP_END_DEFINE
    """
    filename = p.slice[1].filename
    lineno = p.slice[1].lineno

    if p[3] != "(":
        if len(p) == 4:
            p.parser.cparser.handle_define_constant(p[2], None, filename, lineno)
        else:
            p.parser.cparser.handle_define_constant(p[2], p[3], filename, lineno)
    else:
        if p[4] == ")":
            params = []
            if len(p) == 6:
                expr = None
            elif len(p) == 7:
                expr = p[5]
        else:
            params = p[4]
            if len(p) == 7:
                expr = None
            elif len(p) == 8:
                expr = p[6]

        filename = p.slice[1].filename
        lineno = p.slice[1].lineno

        p.parser.cparser.handle_define_macro(p[2], params, expr, filename, lineno)


def p_define_error(p):
    """ define : PP_DEFINE error PP_END_DEFINE
    """
    lexer = p[2].lexer
    clexdata = lexer.tokens
    start = end = p[2].clexpos
    while clexdata[start].type != "PP_DEFINE":
        start -= 1
    while clexdata[end].type != "PP_END_DEFINE":
        end += 1

    name = clexdata[start + 1].value
    if clexdata[start + 1].type == "PP_DEFINE_NAME":
        params = None
        contents = [t.value for t in clexdata[start + 2 : end]]
    else:
        end_of_param_list = start
        while clexdata[end_of_param_list].value != ")" and end_of_param_list < end:
            end_of_param_list += 1
        params = [t.value for t in clexdata[start + 3 : end_of_param_list] if t.value != ","]
        contents = [t.value for t in clexdata[end_of_param_list + 1 : end]]

    filename = p.slice[1].filename
    lineno = p.slice[1].lineno

    p[2].lexer.cparser.handle_define_unparseable(name, params, contents, filename, lineno)


def p_undefine(p):
    """ undefine : PP_UNDEFINE PP_DEFINE_NAME PP_END_DEFINE
    """

    filename = p.slice[1].filename
    lineno = p.slice[1].lineno

    macro = expressions.IdentifierExpressionNode(p[2])
    p.parser.cparser.handle_undefine(macro, filename, lineno)


def p_macro_parameter_list(p):
    """ macro_parameter_list : PP_MACRO_PARAM
                             | macro_parameter_list COMMA PP_MACRO_PARAM
    """
    if len(p) == 2:
        p[0] = [p[1]]
    else:
        p[1].append(p[3])
        p[0] = p[1]


def p_error(t):
    if t.lexer.in_define:
        # p_define_error will generate an error message.
        pass
    else:
        if t.type == "$end":
            t.parser.cparser.handle_error("Syntax error at end of file.", t.filename, 0)
        else:
            t.lexer.cparser.handle_error("Syntax error at %r" % t.value, t.filename, t.lineno)
    # Don't alter lexer: default behaviour is to pass error production
    # up until it hits the catch-all at declaration, at which point
    # parsing continues (synchronisation).


def p_pragma(p):
    """ pragma : pragma_pack
               | PRAGMA pragma_directive_list PRAGMA_END
    """


def p_pragma_pack(p):
    """ pragma_pack : PRAGMA PRAGMA_PACK LPAREN RPAREN PRAGMA_END
                    | PRAGMA PRAGMA_PACK LPAREN constant RPAREN PRAGMA_END
                    | PRAGMA PRAGMA_PACK LPAREN pragma_pack_stack_args RPAREN PRAGMA_END
    """

    err = None
    if len(p) == 6:
        cdeclarations.pragma_pack.set_default()
    elif isinstance(p[4], tuple):
        op, id, n = p[4]
        if op == "push":
            err = cdeclarations.pragma_pack.push(id, n)
        elif op == "pop":
            err = cdeclarations.pragma_pack.pop(id)
        else:
            err = "Syntax error for #pragma pack at {}:{}".format(
                p.slice[1].filename, p.slice[1].lineno
            )
    else:
        cdeclarations.pragma_pack.current = p[4]

    if err:
        p.lexer.cparser.handle_error(err, p.slice[1].filename, p.slice[1].lineno)


def p_pragma_pack_stack_args(p):
    """ pragma_pack_stack_args : IDENTIFIER
                               | IDENTIFIER COMMA IDENTIFIER
                               | IDENTIFIER COMMA IDENTIFIER COMMA constant
                               | IDENTIFIER COMMA constant COMMA IDENTIFIER
                               | IDENTIFIER COMMA constant
    """
    op, id, n = p[1], None, None

    if len(p) > 2:
        if isinstance(p[3], expressions.ConstantExpressionNode):
            n = p[3].value

            if len(p) > 4:
                id = p[5]
        else:
            id = p[3]

            if len(p) > 4:
                n = p[5].value

    p[0] = (op, id, n)


def p_pragma_directive_list(p):
    """ pragma_directive_list : pragma_directive
                              | pragma_directive_list pragma_directive
    """
    if len(p) == 3:
        p[0] = p[1] + (p[2],)
    else:
        p[0] = (p[1],)


def p_pragma_directive(p):
    """ pragma_directive : IDENTIFIER
                         | string_literal
    """
    p[0] = p[1]


def main():
    yacc.yacc(tabmodule="new_parsetab")
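The `p_pragma_pack` rule above drives `cdeclarations.pragma_pack` through `set_default`, `push`, and `pop`. The stack discipline this implies can be sketched as a standalone model; `PragmaPackStack` and its method signatures below are hypothetical, not the actual `cdeclarations.pragma_pack` object.

```python
# Minimal standalone model of #pragma pack(push/pop) semantics as exercised
# by p_pragma_pack. Illustrative sketch only; the real implementation lives
# in ctypesgen.parser.cdeclarations.
class PragmaPackStack:
    def __init__(self):
        self.current = None        # None means "natural alignment"
        self._stack = []           # entries of (identifier, saved alignment)

    def set_default(self):
        # #pragma pack()
        self.current = None

    def push(self, ident=None, n=None):
        # #pragma pack(push[, ident][, n]) saves the current value, then
        # optionally installs a new one.
        self._stack.append((ident, self.current))
        if n is not None:
            self.current = n
        return None

    def pop(self, ident=None):
        # #pragma pack(pop[, ident]); returns an error string on misuse,
        # matching how p_pragma_pack reports err via handle_error.
        if not self._stack:
            return "pack stack empty"
        if ident is None:
            _, self.current = self._stack.pop()
            return None
        for i in range(len(self._stack) - 1, -1, -1):
            if self._stack[i][0] == ident:
                self.current = self._stack[i][1]
                del self._stack[i:]
                return None
        return "no such identifier on pack stack"
```

A `pack(push, 1)` followed by `pack(pop)` restores whatever alignment was in effect before the push.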


================================================
FILE: ctypesgen/parser/cparser.py
================================================
"""
Parse a C source file.

To use, subclass CParser and override its handle_* methods.  Then call
`parse` with the name of a file to parse.
"""

__docformat__ = "restructuredtext"

import os.path
import sys

from ctypesgen.parser import cgrammar, preprocessor, yacc

# --------------------------------------------------------------------------
# Lexer
# --------------------------------------------------------------------------


class CLexer(object):
    def __init__(self, cparser):
        self.cparser = cparser
        self.type_names = set()
        self.in_define = False
        self.lineno = -1
        self.lexpos = -1

    def input(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def token(self):
        while self.pos < len(self.tokens):
            t = self.tokens[self.pos]

            self.pos += 1

            if not t:
                break

            if t.type == "PP_DEFINE":
                self.in_define = True
            elif t.type == "PP_END_DEFINE":
                self.in_define = False

            # Transform PP tokens into C tokens
            elif t.type == "IDENTIFIER" and t.value in cgrammar.keywords:
                t.type = cgrammar.keyword_map[t.value]
            elif t.type == "IDENTIFIER" and t.value in self.type_names:
                if self.pos < 2 or self.tokens[self.pos - 2].type not in (
                    "VOID",
                    "_BOOL",
                    "CHAR",
                    "SHORT",
                    "INT",
                    "LONG",
                    "FLOAT",
                    "DOUBLE",
                    "SIGNED",
                    "UNSIGNED",
                    "ENUM",
                    "STRUCT",
                    "UNION",
                    "TYPE_NAME",
                ):
                    t.type = "TYPE_NAME"

            t.lexer = self
            t.clexpos = self.pos - 1

            return t
        return None


# --------------------------------------------------------------------------
# Parser
# --------------------------------------------------------------------------


class CParser(object):
    """Parse a C source file.

    Subclass and override the handle_* methods.  Call `parse` with the
    name of a file to parse.
    """

    def __init__(self, options):
        super(CParser, self).__init__()
        self.preprocessor_parser = preprocessor.PreprocessorParser(options, self)
        self.parser = yacc.yacc(
            method="LALR",
            debug=False,
            module=cgrammar,
            write_tables=True,
            outputdir=os.path.dirname(__file__),
            optimize=True,
        )

        self.parser.errorfunc = cgrammar.p_error
        self.parser.cparser = self

        self.lexer = CLexer(self)
        if not options.no_stddef_types:
            self.lexer.type_names.add("wchar_t")
            self.lexer.type_names.add("ptrdiff_t")
            self.lexer.type_names.add("size_t")
        if not options.no_gnu_types:
            self.lexer.type_names.add("__builtin_va_list")
        if sys.platform == "win32" and not options.no_python_types:
            self.lexer.type_names.add("__int64")

    def parse(self, filename, debug=False):
        """Parse a file.

        If `debug` is True, parsing state is dumped to stdout.
        """

        self.handle_status("Preprocessing %s" % filename)
        self.preprocessor_parser.parse(filename)
        self.lexer.input(self.preprocessor_parser.output)
        self.handle_status("Parsing %s" % filename)
        self.parser.parse(lexer=self.lexer, debug=debug, tracking=True)

    # ----------------------------------------------------------------------
    # Parser interface.  Override these methods in your subclass.
    # ----------------------------------------------------------------------

    def handle_error(self, message, filename, lineno):
        """A parse error occurred.

        The default implementation prints `lineno` and `message` to stderr.
        The parser will try to recover from errors by synchronising at the
        next semicolon.
        """
        sys.stderr.write("%s:%s %s\n" % (filename, lineno, message))

    def handle_pp_error(self, message):
        """The C preprocessor emitted an error.

        The default implementation prints the error to stderr. If processing
        can continue, it will.
        """
        sys.stderr.write("Preprocessor: {}\n".format(message))

    def handle_status(self, message):
        """Progress information.

        The default implementation prints `message` to stderr.
        """
        sys.stderr.write("{}\n".format(message))

    def handle_define(self, name, params, value, filename, lineno):
        """#define `name` `value`
        or #define `name`(`params`) `value`

        name is a string
        params is None or a list of strings
        value is an ExpressionNode or None
        """

    def handle_define_constant(self, name, value, filename, lineno):
        """#define `name` `value`

        name is a string
        value is an ExpressionNode or None
        """

    def handle_define_macro(self, name, params, value, filename, lineno):
        """#define `name`(`params`) `value`

        name is a string
        params is a list of strings
        value is an ExpressionNode or None
        """

    def handle_undefine(self, name, filename, lineno):
        """#undef `name`

        name is a string
        """

    def impl_handle_declaration(self, declaration, filename, lineno):
        """Internal method that calls `handle_declaration`.  This method
        also adds any new type definitions to the lexer's list of valid type
        names, which affects the parsing of subsequent declarations.
        """
        if declaration.storage == "typedef":
            declarator = declaration.declarator
            if not declarator:
                # XXX TEMPORARY while struct etc not filled
                return
            while declarator.pointer:
                declarator = declarator.pointer
            self.lexer.type_names.add(declarator.identifier)
        self.handle_declaration(declaration, filename, lineno)

    def handle_declaration(self, declaration, filename, lineno):
        """A declaration was encountered.

        `declaration` is an instance of Declaration.  Where a declaration has
        multiple initialisers, each is returned as a separate declaration.
        """
        pass


class DebugCParser(CParser):
    """A convenience class that prints each invocation of a handle_* method to
    stdout.
    """

    def handle_define(self, name, params, value, filename, lineno):
        print("#define name=%r, params=%r, value=%r" % (name, params, value))

    def handle_define_constant(self, name, value, filename, lineno):
        print("#define constant name=%r, value=%r" % (name, value))

    def handle_declaration(self, declaration, filename, lineno):
        print(declaration)

    def get_ctypes_type(self, typ, declarator):
        return typ

    def handle_define_unparseable(self, name, params, value, filename, lineno):
        if params:
            original_string = "#define %s(%s) %s" % (name, ",".join(params), " ".join(value))
        else:
            original_string = "#define %s %s" % (name, " ".join(value))
        print(original_string)


if __name__ == "__main__":
    DebugCParser().parse(sys.argv[1], debug=True)
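`CLexer.token` above promotes an `IDENTIFIER` token to `TYPE_NAME` only when the name is a known typedef and the preceding token is not already a type specifier (so `unsigned size_t` does not parse `size_t` as a type). That decision can be sketched in isolation; the `(type, value)` tuples below stand in for the real ply tokens and are assumptions of this sketch.

```python
# Sketch of the IDENTIFIER -> TYPE_NAME promotion rule used by CLexer.token.
# Tokens are modeled as (type, value) tuples; the real lexer mutates ply
# token objects in place.
TYPE_SPECIFIERS = {
    "VOID", "_BOOL", "CHAR", "SHORT", "INT", "LONG", "FLOAT", "DOUBLE",
    "SIGNED", "UNSIGNED", "ENUM", "STRUCT", "UNION", "TYPE_NAME",
}

def classify(tokens, type_names):
    out = []
    for i, (ttype, value) in enumerate(tokens):
        if ttype == "IDENTIFIER" and value in type_names:
            # Promote only when the previous token is not itself a type
            # specifier, mirroring the lookback in CLexer.token.
            prev = tokens[i - 1][0] if i > 0 else None
            if prev not in TYPE_SPECIFIERS:
                ttype = "TYPE_NAME"
        out.append((ttype, value))
    return out
```

For `size_t n;` the first token becomes `TYPE_NAME` while `n` stays an `IDENTIFIER`; after `UNSIGNED`, a typedef name is left as an `IDENTIFIER`.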


================================================
FILE: ctypesgen/parser/ctypesparser.py
================================================
"""
ctypesgen.parser.ctypesparser contains a class, CtypesParser, which is a
subclass of ctypesgen.parser.cparser.CParser. CtypesParser overrides the
handle_declaration() method of CParser. It turns the low-level type declarations
produced by CParser into CtypesType instances and breaks the parser's general
declarations into function, variable, typedef, constant, and type descriptions.
"""

__docformat__ = "restructuredtext"

__all__ = ["CtypesParser"]

from ctypesgen.ctypedescs import (
    CtypesArray,
    CtypesBitfield,
    CtypesEnum,
    CtypesFunction,
    CtypesPointer,
    CtypesSimple,
    CtypesSpecial,
    CtypesStruct,
    CtypesTypedef,
    ctypes_type_map,
    ctypes_type_map_python_builtin,
    remove_function_pointer,
)
from ctypesgen.expressions import (
    BinaryExpressionNode,
    ConstantExpressionNode,
    IdentifierExpressionNode,
)
from ctypesgen.parser.cdeclarations import (
    Attrib,
    EnumSpecifier,
    Pointer,
    StructTypeSpecifier,
)
from ctypesgen.parser.cparser import CParser


def make_enum_from_specifier(specifier):
    tag = specifier.tag

    enumerators = []
    last_name = None
    for e in specifier.enumerators:
        if e.expression:
            value = e.expression
        else:
            if last_name:
                value = BinaryExpressionNode(
                    "addition",
                    (lambda x, y: x + y),
                    "(%s + %s)",
                    (False, False),
                    IdentifierExpressionNode(last_name),
                    ConstantExpressionNode(1),
                )
            else:
                value = ConstantExpressionNode(0)

        enumerators.append((e.name, value))
        last_name = e.name

    return CtypesEnum(tag, enumerators, src=(specifier.filename, specifier.lineno))


def get_decl_id(decl):
    """Return the identifier of a given declarator"""
    while isinstance(decl, Pointer):
        decl = decl.pointer
    p_name = ""
    if decl is not None and decl.identifier is not None:
        p_name = decl.identifier
    return p_name


class CtypesParser(CParser):
    """Parse a C file for declarations that can be used by ctypes.

    Subclass and override the handle_ctypes_* methods.
    """

    def __init__(self, options):
        super(CtypesParser, self).__init__(options)
        self.type_map = ctypes_type_map
        if not options.no_python_types:
            self.type_map.update(ctypes_type_map_python_builtin)

    def make_struct_from_specifier(self, specifier):
        variety = {True: "union", False: "struct"}[specifier.is_union]
        tag = specifier.tag

        if specifier.declarations:
            members = []
            for declaration in specifier.declarations:
                t = self.get_ctypes_type(
                    declaration.type, declaration.declarator, check_qualifiers=True
                )
                declarator = declaration.declarator
                if declarator is None:
                    # Anonymous field in nested union/struct (C11/GCC).
                    name = None
                else:
                    while declarator.pointer:
                        declarator = declarator.pointer
                    name = declarator.identifier
                members.append((name, remove_function_pointer(t)))
        else:
            members = None

        return CtypesStruct(
            tag, specifier.attrib, variety, members, src=(specifier.filename, specifier.lineno)
        )

    def get_ctypes_type(self, typ, declarator, check_qualifiers=False):
        signed = True
        typename = "int"
        longs = 0
        t = None

        for specifier in typ.specifiers:
            if isinstance(specifier, StructTypeSpecifier):
                t = self.make_struct_from_specifier(specifier)
            elif isinstance(specifier, EnumSpecifier):
                t = make_enum_from_specifier(specifier)
            elif specifier == "signed":
                signed = True
            elif specifier == "unsigned":
                signed = False
            elif specifier == "long":
                longs += 1
            elif specifier == "short":
                longs = -1
            else:
                typename = str(specifier)

        if not t:
            # It is a numeric type of some sort
            if (typename, signed, longs) in self.type_map:
                t = CtypesSimple(typename, signed, longs)

            elif signed and not longs:
                t = CtypesTypedef(typename)

            else:
                name = " ".join(typ.specifiers)
                if typename in [x[0] for x in self.type_map.keys()]:
                    # It's an unsupported variant of a builtin type
                    error = 'Ctypes does not support the type "%s".' % name
                else:
                    error = (
                        "Ctypes does not support adding additional "
                        'specifiers to typedefs, such as "%s"' % name
                    )
                t = CtypesTypedef(name)
                t.error(error, cls="unsupported-type")

            if declarator and declarator.bitfield:
                t = CtypesBitfield(t, declarator.bitfield)

        qualifiers = []
        qualifiers.extend(typ.qualifiers)
        while declarator and declarator.pointer:
            if declarator.parameters is not None:
                variadic = "..." in declarator.parameters

                params = []
                for param in declarator.parameters:
                    if param == "...":
                        break
                    param_name = get_decl_id(param.declarator)
                    ct = self.get_ctypes_type(param.type, param.declarator)
                    ct.identifier = param_name
                    params.append(ct)
                t = CtypesFunction(t, params, variadic)

            a = declarator.array
            while a:
                t = CtypesArray(t, a.size)
                a = a.array

            qualifiers.extend(declarator.qualifiers)

            t = CtypesPointer(t, tuple(typ.qualifiers) + tuple(declarator.qualifiers))

            declarator = declarator.pointer

        if declarator and declarator.parameters is not None:
            variadic = "..." in declarator.parameters

            params = []
            for param in declarator.parameters:
                if param == "...":
                    break
                param_name = get_decl_id(param.declarator)
                ct = self.get_ctypes_type(param.type, param.declarator)
                ct.identifier = param_name
                params.append(ct)
            t = CtypesFunction(t, params, variadic, declarator.attrib)

        if declarator:
            a = declarator.array
            while a:
                t = CtypesArray(t, a.size)
                a = a.array

        if (
            isinstance(t, CtypesPointer)
            and isinstance(t.destination, CtypesSimple)
            and t.destination.name == "char"
            and t.destination.signed
        ):
            t = CtypesSpecial("String")

        return t

    def handle_declaration(self, declaration, filename, lineno):
        t = self.get_ctypes_type(declaration.type, declaration.declarator)

        if type(t) in (CtypesStruct, CtypesEnum):
            self.handle_ctypes_new_type(remove_function_pointer(t), filename, lineno)

        declarator = declaration.declarator
        if declarator is None:
            # XXX TEMPORARY while struct with no typedef not filled in
            return
        while declarator.pointer:
            declarator = declarator.pointer
        name = declarator.identifier
        if declaration.storage == "typedef":
            self.handle_ctypes_typedef(name, remove_function_pointer(t), filename, lineno)
        elif type(t) == CtypesFunction:
            attrib = Attrib(t.attrib)
            attrib.update(declaration.attrib)
            self.handle_ctypes_function(
                name, t.restype, t.argtypes, t.errcheck, t.variadic, attrib, filename, lineno
            )
        elif declaration.storage != "static":
            self.handle_ctypes_variable(name, t, filename, lineno)

    # ctypes parser interface.  Override these methods in your subclass.

    def handle_ctypes_new_type(self, ctype, filename, lineno):
        pass

    def handle_ctypes_typedef(self, name, ctype, filename, lineno):
        pass

    def handle_ctypes_function(
        self, name, restype, argtypes, errcheck, variadic, attrib, filename, lineno
    ):
        pass

    def handle_ctypes_variable(self, name, ctype, filename, lineno):
        pass
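`make_enum_from_specifier` above implements C's implicit enumerator values: an enumerator without an explicit expression is the previous enumerator plus one, and the first defaults to 0. The real code builds `BinaryExpressionNode`/`ConstantExpressionNode` trees; this hypothetical sketch computes concrete integers instead to show the rule.

```python
# Sketch of the implicit-value rule in make_enum_from_specifier: each
# enumerator without an explicit value is (previous + 1), the first is 0.
def enum_values(enumerators):
    """enumerators: list of (name, explicit_value_or_None) pairs."""
    values = {}
    last = None
    for name, explicit in enumerators:
        if explicit is not None:
            value = explicit
        elif last is not None:
            value = values[last] + 1
        else:
            value = 0
        values[name] = value
        last = name
    return values
```

So `enum { A, B, C = 10, D }` yields A=0, B=1, C=10, D=11.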


================================================
FILE: ctypesgen/parser/datacollectingparser.py
================================================
"""
DataCollectingParser subclasses ctypesparser.CtypesParser and builds Description
objects from the CtypesType objects and other information from CtypesParser.
After parsing is complete, a DescriptionCollection object can be retrieved by
calling DataCollectingParser.data().
"""

import os
from tempfile import mkstemp

from ctypesgen.ctypedescs import CtypesEnum, CtypesType, CtypesTypeVisitor
from ctypesgen.descriptions import (
    ConstantDescription,
    DescriptionCollection,
    EnumDescription,
    FunctionDescription,
    MacroDescription,
    StructDescription,
    TypedefDescription,
    UndefDescription,
    VariableDescription,
)
from ctypesgen.expressions import ConstantExpressionNode
from ctypesgen.messages import error_message, status_message
from ctypesgen.parser import ctypesparser


class DataCollectingParser(ctypesparser.CtypesParser, CtypesTypeVisitor):
    """Main class for the Parser component. Steps for use:
    p = DataCollectingParser(names_of_header_files, options)
    p.parse()
    data = p.data()  # a DescriptionCollection of constants, enums, structs, functions, etc.
    """

    def __init__(self, headers, options):
        super(DataCollectingParser, self).__init__(options)
        self.headers = headers
        self.options = options

        self.constants = []
        self.typedefs = []
        self.structs = []
        self.enums = []
        self.functions = []
        self.variables = []
        self.macros = []

        self.all = []
        self.output_order = []

        # NULL is a useful macro to have defined
        null = ConstantExpressionNode(None)
        nullmacro = ConstantDescription("NULL", null, ("<built-in>", 1))
        self.constants.append(nullmacro)
        self.all.append(nullmacro)
        self.output_order.append(("constant", nullmacro))

        # A list of tuples describing macros; saved to be processed after
        # everything else has been parsed
        self.saved_macros = []
        # A set of structs that are already known
        self.already_seen_structs = set()
        # A dict of structs that have only been seen in opaque form
        self.already_seen_opaque_structs = {}
        # A set of enums that are already known
        self.already_seen_enums = set()
        # A dict of enums that have only been seen in opaque form
        self.already_seen_opaque_enums = {}

    def parse(self):
        fd, fname = mkstemp(suffix=".h")
        with os.fdopen(fd, "w") as f:
            for header in self.options.other_headers:
                f.write("#include <%s>\n" % header)
            for header in self.headers:
                f.write('#include "%s"\n' % os.path.abspath(header))
            f.flush()
        try:
            super(DataCollectingParser, self).parse(fname, self.options.debug_level)
        finally:
            os.unlink(fname)

        for name, params, expr, (filename, lineno) in self.saved_macros:
            self.handle_macro(name, params, expr, filename, lineno)

    def handle_define_constant(self, name, expr, filename, lineno):
        # Called by CParser
        # Save to handle later
        self.saved_macros.append((name, None, expr, (filename, lineno)))

    def handle_define_unparseable(self, name, params, value, filename, lineno):
        # Called by CParser
        if params:
            original_string = "#define %s(%s) %s" % (name, ",".join(params), " ".join(value))
        else:
            original_string = "#define %s %s" % (name, " ".join(value))
        macro = MacroDescription(name, params, None, src=(filename, lineno))
        macro.error('Could not parse macro "%s"' % original_string, cls="macro")
        macro.original_string = original_string
        self.macros.append(macro)
        self.all.append(macro)
        self.output_order.append(("macro", macro))

    def handle_define_macro(self, name, params, expr, filename, lineno):
        # Called by CParser
        # Save to handle later
        self.saved_macros.append((name, params, expr, (filename, lineno)))

    def handle_undefine(self, macro, filename, lineno):
        # save to handle later to get order correct
        self.saved_macros.append(("#undef", None, macro, (filename, lineno)))

    def handle_ctypes_typedef(self, name, ctype, filename, lineno):
        # Called by CtypesParser
        ctype.visit(self)

        typedef = TypedefDescription(name, ctype, src=(filename, repr(lineno)))

        self.typedefs.append(typedef)
        self.all.append(typedef)
        self.output_order.append(("typedef", typedef))

    def handle_ctypes_new_type(self, ctype, filename, lineno):
        # Called by CtypesParser
        if isinstance(ctype, CtypesEnum):
            self.handle_enum(ctype, filename, lineno)
        else:
            self.handle_struct(ctype, filename, lineno)

    def handle_ctypes_function(
        self, name, restype, argtypes, errcheck, variadic, attrib, filename, lineno
    ):
        # Called by CtypesParser
        restype.visit(self)
        for argtype in argtypes:
            argtype.visit(self)

        function = FunctionDescription(
            name, restype, argtypes, errcheck, variadic, attrib, src=(filename, repr(lineno))
        )

        self.functions.append(function)
        self.all.append(function)
        self.output_order.append(("function", function))

    def handle_ctypes_variable(self, name, ctype, filename, lineno):
        # Called by CtypesParser
        ctype.visit(self)

        variable = VariableDescription(name, ctype, src=(filename, repr(lineno)))

        self.variables.append(variable)
        self.all.append(variable)
        self.output_order.append(("variable", variable))

    def handle_struct(self, ctypestruct, filename, lineno):
        # Called from within DataCollectingParser

        # When we find an opaque struct, we make a StructDescription for it
        # and record it in self.already_seen_opaque_structs. If we later
        # find a transparent struct with the same tag, we fill in the
        # opaque struct with the information from the transparent struct and
        # move the opaque struct to the end of the struct list.

        name = "%s %s" % (ctypestruct.variety, ctypestruct.tag)

        if name in self.already_seen_structs:
            return

        if ctypestruct.opaque:
            if name not in self.already_seen_opaque_structs:
                struct = StructDescription(
                    ctypestruct.tag,
                    ctypestruct.attrib,
                    ctypestruct.variety,
                    None,  # No members
                    True,  # Opaque
                    ctypestruct,
                    src=(filename, str(lineno)),
                )

                self.already_seen_opaque_structs[name] = struct
                self.structs.append(struct)
                self.all.append(struct)
                self.output_order.append(("struct", struct))

        else:
            for membername, ctype in ctypestruct.members:
                ctype.visit(self)

            if name in self.already_seen_opaque_structs:
                # Fill in older version
                struct = self.already_seen_opaque_structs[name]
                struct.opaque = False
                struct.members = ctypestruct.members
                struct.ctype = ctypestruct
                struct.src = ctypestruct.src

                self.output_order.append(("struct-body", struct))

                del self.already_seen_opaque_structs[name]

            else:
                struct = StructDescription(
                    ctypestruct.tag,
                    ctypestruct.attrib,
                    ctypestruct.variety,
                    ctypestruct.members,
                    False,  # Not opaque
                    src=(filename, str(lineno)),
                    ctype=ctypestruct,
                )
                self.structs.append(struct)
                self.all.append(struct)
                self.output_order.append(("struct", struct))
                self.output_order.append(("struct-body", struct))

            self.already_seen_structs.add(name)

    def handle_enum(self, ctypeenum, filename, lineno):
        # Called from within DataCollectingParser.

        # Process for handling opaque enums is the same as process for opaque
        # structs. See handle_struct() for more details.

        tag = ctypeenum.tag
        if tag in self.already_seen_enums:
            return

        if ctypeenum.opaque:
            if tag not in self.already_seen_opaque_enums:
                enum = EnumDescription(ctypeenum.tag, None, ctypeenum, src=(filename, str(lineno)))
                enum.opaque = True

                self.already_seen_opaque_enums[tag] = enum
                self.enums.append(enum)
                self.all.append(enum)
                self.output_order.append(("enum", enum))

        else:
            if tag in self.already_seen_opaque_enums:
                # Fill in older opaque version
                enum = self.already_seen_opaque_enums[tag]
                enum.opaque = False
                enum.ctype = ctypeenum
                enum.src = ctypeenum.src
                enum.members = ctypeenum.enumerators

                del self.already_seen_opaque_enums[tag]

            else:
                enum = EnumDescription(
                    ctypeenum.tag,
                    ctypeenum.enumerators,
                    src=(filename, str(lineno)),
                    ctype=ctypeenum,
                )
                enum.opaque = False

                self.enums.append(enum)
                self.all.append(enum)
                self.output_order.append(("enum", enum))

            self.already_seen_enums.add(tag)

            for enumname, expr in ctypeenum.enumerators:
                constant = ConstantDescription(enumname, expr, src=(filename, lineno))

                self.constants.append(constant)
                self.all.append(constant)
                self.output_order.append(("constant", constant))

    def handle_macro(self, name, params, expr, filename, lineno):
        # Called from within DataCollectingParser
        src = (filename, lineno)

        if expr is None:
            expr = ConstantExpressionNode(True)
            constant = ConstantDescription(name, expr, src)
            self.constants.append(constant)
            self.all.append(constant)
            return

        expr.visit(self)

        if isinstance(expr, CtypesType):
            if params:
                macro = MacroDescription(name, "", src)
                macro.error(
                    "%s has parameters but evaluates to a type. "
                    "Ctypesgen does not support it." % macro.casual_name(),
                    cls="macro",
                )
                self.macros.append(macro)
                self.all.append(macro)
                self.output_order.append(("macro", macro))

            else:
                typedef = TypedefDescription(name, expr, src)
                self.typedefs.append(typedef)
                self.all.append(typedef)
                self.output_order.append(("typedef", typedef))

        elif name == "#undef":
            undef = UndefDescription(expr, src)
            self.all.append(undef)
            self.output_order.append(("undef", undef))
        else:
            macro = MacroDescription(name, params, expr, src)
            self.macros.append(macro)
            self.all.append(macro)
            self.output_order.append(("macro", macro))

        # Macros could possibly contain things like __FILE__, __LINE__, etc...
        # This could be supported, but it would be a lot of work. It would
        # probably also bloat the Preamble considerably.

    def handle_error(self, message, filename, lineno):
        # Called by CParser
        error_message("%s:%d: %s" % (filename, lineno, message), cls="cparser")

    def handle_pp_error(self, message):
        # Called by PreprocessorParser
        error_message("%s: %s" % (self.options.cpp, message), cls="cparser")

    def handle_status(self, message):
        # Called by CParser
        status_message(message)

    def visit_struct(self, struct):
        self.handle_struct(struct, struct.src[0], struct.src[1])

    def visit_enum(self, enum):
        self.handle_enum(enum, enum.src[0], enum.src[1])

    def data(self):
        return DescriptionCollection(
            self.constants,
            self.typedefs,
            self.structs,
            self.enums,
            self.functions,
            self.variables,
            self.macros,
            self.all,
            self.output_order,
        )


================================================
FILE: ctypesgen/parser/lex.py
================================================
# -----------------------------------------------------------------------------
# ply: lex.py
#
# Copyright (C) 2001-2018
# David M. Beazley (Dabeaz LLC)
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright notice,
#   this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice,
#   this list of conditions and the following disclaimer in the documentation
#   and/or other materials provided with the distribution.
# * Neither the name of the David Beazley or Dabeaz LLC may be used to
#   endorse or promote products derived from this software without
#  specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# -----------------------------------------------------------------------------

__version__    = '3.11'
__tabversion__ = '3.10'

import re
import sys
import types
import copy
import os
import inspect

# This tuple contains known string types
try:
    # Python 2.6
    StringTypes = (types.StringType, types.UnicodeType)
except AttributeError:
    # Python 3.0
    StringTypes = (str, bytes)

# This regular expression is used to match valid token names
_is_identifier = re.compile(r'^[a-zA-Z0-9_]+$')

# Exception thrown when invalid token encountered and no default error
# handler is defined.
class LexError(Exception):
    def __init__(self, message, s):
        self.args = (message,)
        self.text = s


# Token class.  This class is used to represent the tokens produced.
class LexToken(object):
    def __str__(self):
        return 'LexToken(%s,%r,%d,%d)' % (self.type, self.value, self.lineno, self.lexpos)

    def __repr__(self):
        return str(self)


# This object is a stand-in for a logging object created by the
# logging module.

class PlyLogger(object):
    def __init__(self, f):
        self.f = f

    def critical(self, msg, *args, **kwargs):
        self.f.write((msg % args) + '\n')

    def warning(self, msg, *args, **kwargs):
        self.f.write('WARNING: ' + (msg % args) + '\n')

    def error(self, msg, *args, **kwargs):
        self.f.write('ERROR: ' + (msg % args) + '\n')

    info = critical
    debug = critical


# Null logger is used when no output is generated. Does nothing.
class NullLogger(object):
    def __getattribute__(self, name):
        return self

    def __call__(self, *args, **kwargs):
        return self


# -----------------------------------------------------------------------------
#                        === Lexing Engine ===
#
# The following Lexer class implements the lexer runtime.   There are only
# a few public methods and attributes:
#
#    input()          -  Store a new string in the lexer
#    token()          -  Get the next token
#    clone()          -  Clone the lexer
#
#    lineno           -  Current line number
#    lexpos           -  Current position in the input string
# -----------------------------------------------------------------------------

class Lexer:
    def __init__(self):
        self.lexre = None             # Master regular expression. This is a list of
                                      # tuples (re, findex) where re is a compiled
                                      # regular expression and findex is a list
                                      # mapping regex group numbers to rules
        self.lexretext = None         # Current regular expression strings
        self.lexstatere = {}          # Dictionary mapping lexer states to master regexs
        self.lexstateretext = {}      # Dictionary mapping lexer states to regex strings
        self.lexstaterenames = {}     # Dictionary mapping lexer states to symbol names
        self.lexstate = 'INITIAL'     # Current lexer state
        self.lexstatestack = []       # Stack of lexer states
        self.lexstateinfo = None      # State information
        self.lexstateignore = {}      # Dictionary of ignored characters for each state
        self.lexstateerrorf = {}      # Dictionary of error functions for each state
        self.lexstateeoff = {}        # Dictionary of eof functions for each state
        self.lexreflags = 0           # Optional re compile flags
        self.lexdata = None           # Actual input data (as a string)
        self.lexpos = 0               # Current position in input text
        self.lexlen = 0               # Length of the input text
        self.lexerrorf = None         # Error rule (if any)
        self.lexeoff = None           # EOF rule (if any)
        self.lextokens = None         # List of valid tokens
        self.lexignore = ''           # Ignored characters
        self.lexliterals = ''         # Literal characters that can be passed through
        self.lexmodule = None         # Module
        self.lineno = 1               # Current line number
        self.lexoptimize = False      # Optimized mode

    def clone(self, object=None):
        c = copy.copy(self)

        # If the object parameter has been supplied, it means we are attaching the
        # lexer to a new object.  In this case, we have to rebind all methods in
        # the lexstatere and lexstateerrorf tables.

        if object:
            newtab = {}
            for key, ritem in self.lexstatere.items():
                newre = []
                for cre, findex in ritem:
                    newfindex = []
                    for f in findex:
                        if not f or not f[0]:
                            newfindex.append(f)
                            continue
                        newfindex.append((getattr(object, f[0].__name__), f[1]))
                newre.append((cre, newfindex))
                newtab[key] = newre
            c.lexstatere = newtab
            c.lexstateerrorf = {}
            for key, ef in self.lexstateerrorf.items():
                c.lexstateerrorf[key] = getattr(object, ef.__name__)
            c.lexmodule = object
        return c

    # ------------------------------------------------------------
    # writetab() - Write lexer information to a table file
    # ------------------------------------------------------------
    def writetab(self, lextab, outputdir=''):
        if isinstance(lextab, types.ModuleType):
            raise IOError("Won't overwrite existing lextab module")
        basetabmodule = lextab.split('.')[-1]
        filename = os.path.join(outputdir, basetabmodule) + '.py'
        with open(filename, 'w') as tf:
            tf.write('# %s.py. This file automatically created by PLY (version %s). Don\'t edit!\n' % (basetabmodule, __version__))
            tf.write('_tabversion   = %s\n' % repr(__tabversion__))
            tf.write('_lextokens    = set(%s)\n' % repr(tuple(sorted(self.lextokens))))
            tf.write('_lexreflags   = %s\n' % repr(int(self.lexreflags)))
            tf.write('_lexliterals  = %s\n' % repr(self.lexliterals))
            tf.write('_lexstateinfo = %s\n' % repr(self.lexstateinfo))

            # Rewrite the lexstatere table, replacing function objects with function names
            tabre = {}
            for statename, lre in self.lexstatere.items():
                titem = []
                for (pat, func), retext, renames in zip(lre, self.lexstateretext[statename], self.lexstaterenames[statename]):
                    titem.append((retext, _funcs_to_names(func, renames)))
                tabre[statename] = titem

            tf.write('_lexstatere   = %s\n' % repr(tabre))
            tf.write('_lexstateignore = %s\n' % repr(self.lexstateignore))

            taberr = {}
            for statename, ef in self.lexstateerrorf.items():
                taberr[statename] = ef.__name__ if ef else None
            tf.write('_lexstateerrorf = %s\n' % repr(taberr))

            tabeof = {}
            for statename, ef in self.lexstateeoff.items():
                tabeof[statename] = ef.__name__ if ef else None
            tf.write('_lexstateeoff = %s\n' % repr(tabeof))

    # ------------------------------------------------------------
    # readtab() - Read lexer information from a tab file
    # ------------------------------------------------------------
    def readtab(self, tabfile, fdict):
        if isinstance(tabfile, types.ModuleType):
            lextab = tabfile
        else:
            exec('import %s' % tabfile)
            lextab = sys.modules[tabfile]

        if getattr(lextab, '_tabversion', '0.0') != __tabversion__:
            raise ImportError('Inconsistent PLY version')

        self.lextokens      = lextab._lextokens
        self.lexreflags     = lextab._lexreflags
        self.lexliterals    = lextab._lexliterals
        self.lextokens_all  = self.lextokens | set(self.lexliterals)
        self.lexstateinfo   = lextab._lexstateinfo
        self.lexstateignore = lextab._lexstateignore
        self.lexstatere     = {}
        self.lexstateretext = {}
        for statename, lre in lextab._lexstatere.items():
            titem = []
            txtitem = []
            for pat, func_name in lre:
                titem.append((re.compile(pat, lextab._lexreflags), _names_to_funcs(func_name, fdict)))

            self.lexstatere[statename] = titem
            self.lexstateretext[statename] = txtitem

        self.lexstateerrorf = {}
        for statename, ef in lextab._lexstateerrorf.items():
            self.lexstateerrorf[statename] = fdict[ef]

        self.lexstateeoff = {}
        for statename, ef in lextab._lexstateeoff.items():
            self.lexstateeoff[statename] = fdict[ef]

        self.begin('INITIAL')

    # ------------------------------------------------------------
    # input() - Push a new string into the lexer
    # ------------------------------------------------------------
    def input(self, s):
        # Pull off the first character to see if s looks like a string
        c = s[:1]
        if not isinstance(c, StringTypes):
            raise ValueError('Expected a string')
        self.lexdata = s
        self.lexpos = 0
        self.lexlen = len(s)

    # ------------------------------------------------------------
    # begin() - Changes the lexing state
    # ------------------------------------------------------------
    def begin(self, state):
        if state not in self.lexstatere:
            raise ValueError('Undefined state')
        self.lexre = self.lexstatere[state]
        self.lexretext = self.lexstateretext[state]
        self.lexignore = self.lexstateignore.get(state, '')
        self.lexerrorf = self.lexstateerrorf.get(state, None)
        self.lexeoff = self.lexstateeoff.get(state, None)
        self.lexstate = state

    # ------------------------------------------------------------
    # push_state() - Changes the lexing state and saves old on stack
    # ------------------------------------------------------------
    def push_state(self, state):
        self.lexstatestack.append(self.lexstate)
        self.begin(state)

    # ------------------------------------------------------------
    # pop_state() - Restores the previous state
    # ------------------------------------------------------------
    def pop_state(self):
        self.begin(self.lexstatestack.pop())

    # ------------------------------------------------------------
    # current_state() - Returns the current lexing state
    # ------------------------------------------------------------
    def current_state(self):
        return self.lexstate

    # ------------------------------------------------------------
    # skip() - Skip ahead n characters
    # ------------------------------------------------------------
    def skip(self, n):
        self.lexpos += n

    # ------------------------------------------------------------
    # token() - Return the next token from the Lexer
    #
    # Note: This function has been carefully implemented to be as fast
    # as possible.  Don't make changes unless you really know what
    # you are doing
    # ------------------------------------------------------------
    def token(self):
        # Make local copies of frequently referenced attributes
        lexpos    = self.lexpos
        lexlen    = self.lexlen
        lexignore = self.lexignore
        lexdata   = self.lexdata

        while lexpos < lexlen:
            # Short-circuit handling for whitespace, tabs, and other ignored characters
            if lexdata[lexpos] in lexignore:
                lexpos += 1
                continue

            # Look for a regular expression match
            for lexre, lexindexfunc in self.lexre:
                m = lexre.match(lexdata, lexpos)
                if not m:
                    continue

                # Create a token for return
                tok = LexToken()
                tok.value = m.group()
                tok.lineno = self.lineno
                tok.lexpos = lexpos

                i = m.lastindex
                func, tok.type = lexindexfunc[i]

                if not func:
                    # If no token type was set, it's an ignored token
                    if tok.type:
                        self.lexpos = m.end()
                        return tok
                    else:
                        lexpos = m.end()
                        break

                lexpos = m.end()

                # If token is processed by a function, call it

                tok.lexer = self      # Set additional attributes useful in token rules
                self.lexmatch = m
                self.lexpos = lexpos

                newtok = func(tok)

                # Every function must return a token; if it returns nothing, we just move on to the next token
                if not newtok:
                    lexpos    = self.lexpos         # This is here in case user has updated lexpos.
                    lexignore = self.lexignore      # This is here in case there was a state change
                    break

                # Verify type of the token.  If not in the token map, raise an error
                if not self.lexoptimize:
                    if newtok.type not in self.lextokens_all:
                        raise LexError("%s:%d: Rule '%s' returned an unknown token type '%s'" % (
                            func.__code__.co_filename, func.__code__.co_firstlineno,
                            func.__name__, newtok.type), lexdata[lexpos:])

                return newtok
            else:
                # No match, see if in literals
                if lexdata[lexpos] in self.lexliterals:
                    tok = LexToken()
                    tok.value = lexdata[lexpos]
                    tok.lineno = self.lineno
                    tok.type = tok.value
                    tok.lexpos = lexpos
                    self.lexpos = lexpos + 1
                    return tok

                # No match. Call t_error() if defined.
                if self.lexerrorf:
                    tok = LexToken()
                    tok.value = self.lexdata[lexpos:]
                    tok.lineno = self.lineno
                    tok.type = 'error'
                    tok.lexer = self
                    tok.lexpos = lexpos
                    self.lexpos = lexpos
                    newtok = self.lexerrorf(tok)
                    if lexpos == self.lexpos:
                        # Error method didn't change text position at all. This is an error.
                        raise LexError("Scanning error. Illegal character '%s'" % (lexdata[lexpos]), lexdata[lexpos:])
                    lexpos = self.lexpos
                    if not newtok:
                        continue
                    return newtok

                self.lexpos = lexpos
                raise LexError("Illegal character '%s' at index %d" % (lexdata[lexpos], lexpos), lexdata[lexpos:])

        if self.lexeoff:
            tok = LexToken()
            tok.type = 'eof'
            tok.value = ''
            tok.lineno = self.lineno
            tok.lexpos = lexpos
            tok.lexer = self
            self.lexpos = lexpos
            newtok = self.lexeoff(tok)
            return newtok

        self.lexpos = lexpos + 1
        if self.lexdata is None:
            raise RuntimeError('No input string given with input()')
        return None

    # Iterator interface
    def __iter__(self):
        return self

    def next(self):
        t = self.token()
        if t is None:
            raise StopIteration
        return t

    __next__ = next
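The matching loop in `token()` above can be illustrated with a self-contained sketch: skip ignored characters, try a master regex of named groups, fall back to single-character literals, and raise on anything unmatched. This is a hypothetical mini-tokenizer (`tokenize()` and its patterns are invented for illustration), not the PLY API itself.

```python
import re

# Hypothetical mini-tokenizer mirroring the flow of Lexer.token().
def tokenize(data, master, ignore=' \t', literals='+-'):
    pos, out = 0, []
    while pos < len(data):
        if data[pos] in ignore:          # short-circuit ignored characters
            pos += 1
            continue
        m = master.match(data, pos)
        if m:                            # master regex matched a rule
            out.append((m.lastgroup, m.group()))
            pos = m.end()
        elif data[pos] in literals:      # literal pass-through, type == value
            out.append((data[pos], data[pos]))
            pos += 1
        else:                            # no rule and no t_error analogue
            raise ValueError("Illegal character %r at index %d" % (data[pos], pos))
    return out

master = re.compile(r'(?P<NUMBER>\d+)|(?P<NAME>[A-Za-z_]\w*)')
assert tokenize('x + 12', master) == [('NAME', 'x'), ('+', '+'), ('NUMBER', '12')]
```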

# -----------------------------------------------------------------------------
#                           ==== Lex Builder ===
#
# The functions and classes below are used to collect lexing information
# and build a Lexer object from it.
# -----------------------------------------------------------------------------

# -----------------------------------------------------------------------------
# _get_regex(func)
#
# Returns the regular expression assigned to a function either as a doc string
# or as a .regex attribute attached by the @TOKEN decorator.
# -----------------------------------------------------------------------------
def _get_regex(func):
    return getattr(func, 'regex', func.__doc__)

# -----------------------------------------------------------------------------
# get_caller_module_dict()
#
# This function returns a dictionary containing all of the symbols defined within
# a caller further down the call stack.  This is used to get the environment
# associated with the yacc() call if none was provided.
# -----------------------------------------------------------------------------
def get_caller_module_dict(levels):
    f = sys._getframe(levels)
    ldict = f.f_globals.copy()
    if f.f_globals != f.f_locals:
        ldict.update(f.f_locals)
    return ldict
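The frame-inspection trick above can be sketched in isolation: merging the caller's globals and locals is what lets `lex()`/`yacc()` discover `t_*` rules without them being passed in explicitly. `caller_dict` and `build_lexer_env` are invented names for this sketch.

```python
import sys

# Hypothetical helper: collect the symbols visible in a caller's frame.
def caller_dict(levels):
    frame = sys._getframe(levels)
    d = frame.f_globals.copy()
    if frame.f_globals is not frame.f_locals:
        d.update(frame.f_locals)         # locals shadow globals, as in PLY
    return d

def build_lexer_env():
    t_PLUS = r'\+'                       # a local "rule" the helper should see
    return caller_dict(1)                # one level up: this function's frame

env = build_lexer_env()
assert env['t_PLUS'] == r'\+'
```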

# -----------------------------------------------------------------------------
# _funcs_to_names()
#
# Given a list of regular expression functions, this converts it to a list
# suitable for output to a table file
# -----------------------------------------------------------------------------
def _funcs_to_names(funclist, namelist):
    result = []
    for f, name in zip(funclist, namelist):
        if f and f[0]:
            result.append((name, f[1]))
        else:
            result.append(f)
    return result

# -----------------------------------------------------------------------------
# _names_to_funcs()
#
# Given a list of regular expression function names, this converts it back to
# functions.
# -----------------------------------------------------------------------------
def _names_to_funcs(namelist, fdict):
    result = []
    for n in namelist:
        if n and n[0]:
            result.append((fdict[n[0]], n[1]))
        else:
            result.append(n)
    return result
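The two helpers above form a round trip: callables are replaced by their names when `writetab()` emits the table file, then resolved back through a lookup dict when `readtab()` loads it. A minimal sketch of that round trip (names such as `to_names`, `to_funcs`, and `t_NUMBER` are invented):

```python
# Functions become names for serialization...
def to_names(funclist):
    return [(f[0].__name__, f[1]) if f and f[0] else f for f in funclist]

# ...and names become functions again on load, via a lookup dict.
def to_funcs(namelist, fdict):
    return [(fdict[n[0]], n[1]) if n and n[0] else n for n in namelist]

def t_NUMBER(t):
    return t

pairs = [(t_NUMBER, 'NUMBER'), (None, 'PLUS')]   # function rule, string rule
names = to_names(pairs)
assert names == [('t_NUMBER', 'NUMBER'), (None, 'PLUS')]
assert to_funcs(names, {'t_NUMBER': t_NUMBER}) == pairs
```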

# -----------------------------------------------------------------------------
# _form_master_re()
#
# This function takes a list of all of the regex components and attempts to
# form the master regular expression.  Given limitations in the Python re
# module, it may be necessary to break the master regex into separate expressions.
# -----------------------------------------------------------------------------
def _form_master_re(relist, reflags, ldict, toknames):
    if not relist:
        return []
    regex = '|'.join(relist)
    try:
        lexre = re.compile(regex, reflags)

        # Build the index to function map for the matching engine
        lexindexfunc = [None] * (max(lexre.groupindex.values()) + 1)
        lexindexnames = lexindexfunc[:]

        for f, i in lexre.groupindex.items():
            handle = ldict.get(f, None)
            if type(handle) in (types.FunctionType, types.MethodType):
                lexindexfunc[i] = (handle, toknames[f])
                lexindexnames[i] = f
            elif handle is not None:
                lexindexnames[i] = f
                if f.find('ignore_') > 0:
                    lexindexfunc[i] = (None, None)
                else:
                    lexindexfunc[i] = (None, toknames[f])

        return [(lexre, lexindexfunc)], [regex], [lexindexnames]
    except Exception:
        m = int(len(relist)/2)
        if m == 0:
            m = 1
        llist, lre, lnames = _form_master_re(relist[:m], reflags, ldict, toknames)
        rlist, rre, rnames = _form_master_re(relist[m:], reflags, ldict, toknames)
        return (llist+rlist), (lre+rre), (lnames+rnames)
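The master-regex technique above can be shown with a small stdlib-only sketch: each rule becomes a named group, the groups are joined with `|`, and `m.lastindex` maps the winning group back to its rule through `groupindex` (the recursive split-and-retry exists because older versions of the `re` module capped the number of groups in one pattern). Rule names here are invented.

```python
import re

rules = [('NUMBER', r'\d+'), ('NAME', r'[A-Za-z_]\w*')]
relist = ['(?P<%s>%s)' % (name, pat) for name, pat in rules]
master = re.compile('|'.join(relist))

# index -> rule name, mirroring lexindexfunc/lexindexnames above
index_to_name = [None] * (max(master.groupindex.values()) + 1)
for name, i in master.groupindex.items():
    index_to_name[i] = name

assert index_to_name[master.match('42').lastindex] == 'NUMBER'
assert index_to_name[master.match('abc').lastindex] == 'NAME'
```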

# -----------------------------------------------------------------------------
# def _statetoken(s,names)
#
# Given a declaration name s of the form "t_" and a dictionary whose keys are
# state names, this function returns a tuple (states,tokenname) where states
# is a tuple of state names and tokenname is the name of the token.  For example,
# calling this with s = "t_foo_bar_SPAM" might return (('foo','bar'),'SPAM')
# -----------------------------------------------------------------------------
def _statetoken(s, names):
    parts = s.split('_')
    for i, part in enumerate(parts[1:], 1):
        if part not in names and part != 'ANY':
            break

    if i > 1:
        states = tuple(parts[1:i])
    else:
        states = ('INITIAL',)

    if 'ANY' in states:
        states = tuple(names)

    tokenname = '_'.join(parts[i:])
    return (states, tokenname)
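The naming convention above can be re-implemented as a short sketch: strip the `t_` prefix, consume leading components that are known state names (or `ANY`), and treat the remainder as the token name. `statetoken` here is a hypothetical stand-alone version, not the function above.

```python
def statetoken(s, names):
    parts = s.split('_')
    i = 1
    for i, part in enumerate(parts[1:], 1):
        if part not in names and part != 'ANY':
            break                        # first non-state component starts the token name
    states = tuple(parts[1:i]) if i > 1 else ('INITIAL',)
    if 'ANY' in states:
        states = tuple(names)            # ANY expands to every declared state
    return states, '_'.join(parts[i:])

assert statetoken('t_foo_bar_SPAM', {'foo', 'bar'}) == (('foo', 'bar'), 'SPAM')
assert statetoken('t_NUMBER', {'foo', 'bar'}) == (('INITIAL',), 'NUMBER')
assert statetoken('t_ANY_error', {'foo'}) == (('foo',), 'error')
```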


# -----------------------------------------------------------------------------
# LexerReflect()
#
# This class represents information needed to build a lexer as extracted from a
# user's input file.
# -----------------------------------------------------------------------------
class LexerReflect(object):
    def __init__(self, ldict, log=None, reflags=0):
        self.ldict      = ldict
        self.error_func = None
        self.tokens     = []
        self.reflags    = reflags
        self.stateinfo  = {'INITIAL': 'inclusive'}
        self.modules    = set()
        self.error      = False
        self.log        = PlyLogger(sys.stderr) if log is None else log

    # Get all of the basic information
    def get_all(self):
        self.get_tokens()
        self.get_literals()
        self.get_states()
        self.get_rules()

    # Validate all of the information
    def validate_all(self):
        self.validate_tokens()
        self.validate_literals()
        self.validate_rules()
        return self.error

    # Get the tokens map
    def get_tokens(self):
        tokens = self.ldict.get('tokens', None)
        if not tokens:
            self.log.error('No token list is defined')
            self.error = True
            return

        if not isinstance(tokens, (list, tuple)):
            self.log.error('tokens must be a list or tuple')
            self.error = True
            return

        if not tokens:
            self.log.error('tokens is empty')
            self.error = True
            return

        self.tokens = tokens

    # Validate the tokens
    def validate_tokens(self):
        terminals = {}
        for n in self.tokens:
            if not _is_identifier.match(n):
                self.log.error("Bad token name '%s'", n)
                self.error = True
            if n in terminals:
                self.log.warning("Token '%s' multiply defined", n)
            terminals[n] = 1

    # Get the literals specifier
    def get_literals(self):
        self.literals = self.ldict.get('literals', '')
        if not self.literals:
            self.literals = ''

    # Validate literals
    def validate_literals(self):
        try:
            for c in self.literals:
                if not isinstance(c, StringTypes) or len(c) > 1:
                    self.log.error('Invalid literal %s. Must be a single character', repr(c))
                    self.error = True

        except TypeError:
            self.log.error('Invalid literals specification. literals must be a sequence of characters')
            self.error = True

    def get_states(self):
        self.states = self.ldict.get('states', None)
        # Build statemap
        if self.states:
            if not isinstance(self.states, (tuple, list)):
                self.log.error('states must be defined as a tuple or list')
                self.error = True
            else:
                for s in self.states:
                    if not isinstance(s, tuple) or len(s) != 2:
                        self.log.error("Invalid state specifier %s. Must be a tuple (statename,'exclusive|inclusive')", repr(s))
                        self.error = True
                        continue
                    name, statetype = s
                    if not isinstance(name, StringTypes):
                        self.log.error('State name %s must be a string', repr(name))
                        self.error = True
                        continue
                    if not (statetype == 'inclusive' or statetype == 'exclusive'):
                        self.log.error("State type for state %s must be 'inclusive' or 'exclusive'", name)
                        self.error = True
                        continue
                    if name in self.stateinfo:
                        self.log.error("State '%s' already defined", name)
                        self.error = True
                        continue
                    self.stateinfo[name] = statetype

    # Get all of the symbols with a t_ prefix and sort them into various
    # categories (functions, strings, error functions, and ignore characters)

    def get_rules(self):
        tsymbols = [f for f in self.ldict if f[:2] == 't_']

        # Now build up a list of functions and a list of strings
        self.toknames = {}        # Mapping of symbols to token names
        self.funcsym  = {}        # Symbols defined as functions
        self.strsym   = {}        # Symbols defined as strings
        self.ignore   = {}        # Ignore strings by state
        self.errorf   = {}        # Error functions by state
        self.eoff     = {}        # EOF functions by state

        for s in self.stateinfo:
            self.funcsym[s] = []
            self.strsym[s] = []

        if len(tsymbols) == 0:
            self.log.error('No rules of the form t_rulename are defined')
            self.error = True
            return

        for f in tsymbols:
            t = self.ldict[f]
            states, tokname = _statetoken(f, self.stateinfo)
            self.toknames[f] = tokname

            if hasattr(t, '__call__'):
                if tokname == 'error':
                    for s in states:
                        self.errorf[s] = t
                elif tokname == 'eof':
                    for s in states:
                        self.eoff[s] = t
                elif tokname == 'ignore':
                    line = t.__code__.co_firstlineno
                    file = t.__code__.co_filename
                    self.log.error("%s:%d: Rule '%s' must be defined as a string", file, line, t.__name__)
                    self.error = True
                else:
                    for s in states:
                        self.funcsym[s].append((f, t))
            elif isinstance(t, StringTypes):
                if tokname == 'ignore':
                    for s in states:
                        self.ignore[s] = t
                    if '\\' in t:
                        self.log.warning("%s contains a literal backslash '\\'", f)

                elif tokname == 'error':
                    self.log.error("Rule '%s' must be defined as a function", f)
                    self.error = True
                else:
                    for s in states:
                        self.strsym[s].append((f, t))
            else:
                self.log.error('%s not defined as a function or string', f)
                self.error = True

        # Sort the functions by line number
        for f in self.funcsym.values():
            f.sort(key=lambda x: x[1].__code__.co_firstlineno)

        # Sort the strings by regular expression length
        for s in self.strsym.values():
            s.sort(key=lambda x: len(x[1]), reverse=True)
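The ordering above implements a longest-pattern-first heuristic for string rules (function rules keep their definition order instead). A sketch of why the sort matters once the patterns are joined into one alternation; the rule names are invented:

```python
import re

# Longer patterns must come first so '==' wins over '=' in the alternation.
strsym = [('t_EQ', r'='), ('t_EQEQ', r'==')]
strsym.sort(key=lambda x: len(x[1]), reverse=True)
assert [name for name, _ in strsym] == ['t_EQEQ', 't_EQ']

master = re.compile('|'.join('(?P<%s>%s)' % (n[2:], p) for n, p in strsym))
assert master.match('==').lastgroup == 'EQEQ'
```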

    # Validate all of the t_rules collected
    def validate_rules(self):
        for state in self.stateinfo:
            # Validate all rules defined by functions

            for fname, f in self.funcsym[state]:
                line = f.__code__.co_firstlineno
                file = f.__code__.co_filename
                module = inspect.getmodule(f)
                self.modules.add(module)

                tokname = self.toknames[fname]
                if isinstance(f, types.MethodType):
                    reqargs = 2
                else:
                    reqargs = 1
                nargs = f.__code__.co_argcount
                if nargs > reqargs:
                    self.log.error("%s:%d: Rule '%s' has too many arguments", file, line, f.__name__)
                    self.error = True
                    continue

                if nargs < reqargs:
                    self.log.error("%s:%d: Rule '%s' requires an argument", file, line, f.__name__)
                    self.error = True
                    continue

                if not _get_regex(f):
                    self.log.error("%s:%d: No regular expression defined for rule '%s'", file, line, f.__name__)
                    self.error = True
                    continue

                try:
                    c = re.compile('(?P<%s>%s)' % (fname, _get_regex(f)), self.reflags)
                    if c.match(''):
                        self.log.error("%s:%d: Regular expression for rule '%s' matches empty string", file, line, f.__name__)
                        self.error = True
                except re.error as e:
                    self.log.error("%s:%d: Invalid regular expression for rule '%s'. %s", file, line, f.__name__, e)
                    if '#' in _get_regex(f):
                        self.log.error("%s:%d. Make sure '#' in rule '%s' is escaped with '\\#'", file, line, f.__name__)
                    self.error = True

            # Validate all rules defined by strings
            for name, r in self.strsym[state]:
                tokname = self.toknames[name]
                if tokname == 'error':
                    self.log.error("Rule '%s' must be defined as a function", name)
                    self.error = True
                    continue

                if tokname not in self.tokens and tokname.find('ignore_') < 0:
                    self.log.error("Rule '%s' defined for an unspecified token %s", name, tokname)
                    self.error = True
                    continue

                try:
                    c = re.compile('(?P<%s>%s)' % (name, r), self.reflags)
                    if (c.match('')):
                        self.log.error("Regular expression for rule '%s' matches empty string", name)
                        self.error = True
                except re.error as e:
                    self.log.error("Invalid regular expression for rule '%s'. %s", name, e)
                    if '#' in r:
                        self.log.error("Make sure '#' in rule '%s' is escaped with '\\#'", name)
                    self.error = True

            if not self.funcsym[state] and not self.strsym[state]:
                self.log.error("No rules defined for state '%s'", state)
                self.error = True

            # Validate the error function
            efunc = self.errorf.get(state, None)
            if efunc:
                f = efunc
                line = f.__code__.co_firstlineno
                file = f.__code__.co_filename
                module = inspect.getmodule(f)
                self.modules.add(module)

                if isinstance(f, types.MethodType):
                    reqargs = 2
                else:
                    reqargs = 1
                nargs = f.__code__.co_argcount
                if nargs > reqargs:
                    self.log.error("%s:%d: Rule '%s' has too many arguments", file, line, f.__name__)
                    self.error = True

                if nargs < reqargs:
                    self.log.error("%s:%d: Rule '%s' requires an argument", file, line, f.__name__)
                    self.error = True

        for module in self.modules:
            self.validate_module(module)

    # -----------------------------------------------------------------------------
    # validate_module()
    #
    # This checks to see if there are duplicated t_rulename() functions or strings
    # in the parser input file.  This is done using a simple regular expression
    # match on each line in the source code of the given module.
    # -----------------------------------------------------------------------------

    def validate_module(self, module):
        try:
            lines, linen = inspect.getsourcelines(module)
        except IOError:
            return

        fre = re.compile(r'\s*def\s+(t_[a-zA-Z_0-9]*)\(')
        sre = re.compile(r'\s*(t_[a-zA-Z_0-9]*)\s*=')

        counthash = {}
        linen += 1
        for line in lines:
            m = fre.match(line)
            if not m:
                m = sre.match(line)
            if m:
                name = m.group(1)
                prev = counthash.get(name)
                if not prev:
                    counthash[name] = linen
                else:
                    filename = inspect.getsourcefile(module)
                    self.log.error('%s:%d: Rule %s redefined. Previously defined on line %d', filename, linen, name, prev)
                    self.error = True
            linen += 1

# -----------------------------------------------------------------------------
# lex(module)
#
# Build all of the regular expression rules from definitions in the supplied module
# -----------------------------------------------------------------------------
def lex(module=None, object=None, debug=False, optimize=False, lextab='lextab',
        reflags=int(re.VERBOSE), nowarn=False, outputdir=None, debuglog=None, errorlog=None, cls=Lexer):  # <mod NL>

    if lextab is None:
        lextab = 'lextab'

    global lexer

    ldict = None
    stateinfo = {'INITIAL': 'inclusive'}
    lexobj = cls()  # <mod NL>
    lexobj.lexoptimize = optimize
    global token, input

    if errorlog is None:
        errorlog = PlyLogger(sys.stderr)

    if debug:
        if debuglog is None:
            debuglog = PlyLogger(sys.stderr)

    # Get the module dictionary used for the lexer
    if object:
        module = object

    # Get the module dictionary used for the lexer
    if module:
        _items = [(k, getattr(module, k)) for k in dir(module)]
        ldict = dict(_items)
        # If no __file__ attribute is available, try to obtain it from the __module__ instead
        if '__file__' not in ldict:
            ldict['__file__'] = sys.modules[ldict['__module__']].__file__
    else:
        ldict = get_caller_module_dict(2)

    # Determine if the module is part of a package.
    # If so, fix the lextab setting so that tables load correctly
    pkg = ldict.get('__package__')
    if pkg and isinstance(lextab, str):
        if '.' not in lextab:
            lextab = pkg + '.' + lextab

    # Collect lexer information from the dictionary
    linfo = LexerReflect(ldict, log=errorlog, reflags=reflags)
    linfo.get_all()
    if not optimize:
        if linfo.validate_all():
            raise SyntaxError("Can't build lexer")

    if optimize and lextab:
        try:
            lexobj.readtab(lextab, ldict)
            token = lexobj.token
            input = lexobj.input
            lexer = lexobj
            return lexobj

        except ImportError:
            pass

    # Dump some basic debugging information
    if debug:
        debuglog.info('lex: tokens   = %r', linfo.tokens)
        debuglog.info('lex: literals = %r', linfo.literals)
        debuglog.info('lex: states   = %r', linfo.stateinfo)

    # Build a dictionary of valid token names
    lexobj.lextokens = set()
    for n in linfo.tokens:
        lexobj.lextokens.add(n)

    # Get literals specification
    if isinstance(linfo.literals, (list, tuple)):
        lexobj.lexliterals = type(linfo.literals[0])().join(linfo.literals)
    else:
        lexobj.lexliterals = linfo.literals

    lexobj.lextokens_all = lexobj.lextokens | set(lexobj.lexliterals)

    # Get the stateinfo dictionary
    stateinfo = linfo.stateinfo

    regexs = {}
    # Build the master regular expressions
    for state in stateinfo:
        regex_list = []

        # Add rules defined by functions first
        for fname, f in linfo.funcsym[state]:
            regex_list.append('(?P<%s>%s)' % (fname, _get_regex(f)))
            if debug:
                debuglog.info("lex: Adding rule %s -> '%s' (state '%s')", fname, _get_regex(f), state)

        # Now add all of the simple rules
        for name, r in linfo.strsym[state]:
            regex_list.append('(?P<%s>%s)' % (name, r))
            if debug:
                debuglog.info("lex: Adding rule %s -> '%s' (state '%s')", name, r, state)

        regexs[state] = regex_list

    # Build the master regular expressions

    if debug:
        debuglog.info('lex: ==== MASTER REGEXS FOLLOW ====')

    for state in regexs:
        lexre, re_text, re_names = _form_master_re(regexs[state], reflags, ldict, linfo.toknames)
        lexobj.lexstatere[state] = lexre
        lexobj.lexstateretext[state] = re_text
        lexobj.lexstaterenames[state] = re_names
        if debug:
            for i, text in enumerate(re_text):
                debuglog.info("lex: state '%s' : regex[%d] = '%s'", state, i, text)

    # For inclusive states, we need to add the regular expressions from the INITIAL state
    for state, stype in stateinfo.items():
        if state != 'INITIAL' and stype == 'inclusive':
            lexobj.lexstatere[state].extend(lexobj.lexstatere['INITIAL'])
            lexobj.lexstateretext[state].extend(lexobj.lexstateretext['INITIAL'])
            lexobj.lexstaterenames[state].extend(lexobj.lexstaterenames['INITIAL'])

    lexobj.lexstateinfo = stateinfo
    lexobj.lexre = lexobj.lexstatere['INITIAL']
    lexobj.lexretext = lexobj.lexstateretext['INITIAL']
    lexobj.lexreflags = reflags

    # Set up ignore variables
    lexobj.lexstateignore = linfo.ignore
    lexobj.lexignore = lexobj.lexstateignore.get('INITIAL', '')

    # Set up error functions
    lexobj.lexstateerrorf = linfo.errorf
    lexobj.lexerrorf = linfo.errorf.get('INITIAL', None)
    if not lexobj.lexerrorf:
        errorlog.warning('No t_error rule is defined')

    # Set up eof functions
    lexobj.lexstateeoff = linfo.eoff
    lexobj.lexeoff = linfo.eoff.get('INITIAL', None)

    # Check state information for ignore and error rules
    for s, stype in stateinfo.items():
        if stype == 'exclusive':
            if s not in linfo.errorf:
                errorlog.warning("No error rule is defined for exclusive state '%s'", s)
            if s not in linfo.ignore and lexobj.lexignore:
                errorlog.warning("No ignore rule is defined for exclusive state '%s'", s)
        elif stype == 'inclusive':
            if s not in linfo.errorf:
                linfo.errorf[s] = linfo.errorf.get('INITIAL', None)
            if s not in linfo.ignore:
                linfo.ignore[s] = linfo.ignore.get('INITIAL', '')

    # Create global versions of the token() and input() functions
    token = lexobj.token
    input = lexobj.input
    lexer = lexobj

    # If in optimize mode, we write the lextab
    if lextab and optimize:
        if outputdir is None:
            # If no output directory is set, the location of the output files
            # is determined according to the following rules:
            #     - If lextab specifies a package, files go into that package directory
            #     - Otherwise, files go in the same directory as the specifying module
            if isinstance(lextab, types.ModuleType):
                srcfile = lextab.__file__
            else:
                if '.' not in lextab:
                    srcfile = ldict['__file__']
                else:
                    parts = lextab.split('.')
                    pkgname = '.'.join(parts[:-1])
                    exec('import %s' % pkgname)
                    srcfile = getattr(sys.modules[pkgname], '__file__', '')
            outputdir = os.path.dirname(srcfile)
        try:
            lexobj.writetab(lextab, outputdir)
            if lextab in sys.modules:
                del sys.modules[lextab]
        except IOError as e:
            errorlog.warning("Couldn't write lextab module %r. %s", lextab, e)

    return lexobj

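# lex() compiles every rule into one master regular expression of named
# groups and dispatches on match.lastgroup at scan time.  A minimal,
# self-contained sketch of that technique (the three rules below are
# hypothetical, not taken from any real token module):

```python
import re

# One named group per rule, joined with '|' -- the same shape that
# _form_master_re() builds for each lexer state.
rules = [('NUMBER', r'\d+'), ('ID', r'[A-Za-z_]\w+'), ('PLUS', r'\+')]
master = re.compile('|'.join('(?P<%s>%s)' % (name, pat) for name, pat in rules))

def tokenize(text):
    tokens, pos = [], 0
    while pos < len(text):
        m = master.match(text, pos)
        if not m:
            pos += 1           # no rule matched: skip a char (a crude t_error)
            continue
        tokens.append((m.lastgroup, m.group()))  # lastgroup names the rule
        pos = m.end()
    return tokens

print(tokenize('abc+42'))  # -> [('ID', 'abc'), ('PLUS', '+'), ('NUMBER', '42')]
```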
# -----------------------------------------------------------------------------
SYMBOL INDEX (990 symbols across 30 files)

FILE: ctypesgen/__main__.py
  function find_names_in_modules (line 18) | def find_names_in_modules(modules):
  function main (line 30) | def main(givenargs=None):

FILE: ctypesgen/ctypedescs.py
  class CtypesTypeVisitor (line 79) | class CtypesTypeVisitor(object):
    method visit_struct (line 80) | def visit_struct(self, struct):
    method visit_enum (line 83) | def visit_enum(self, enum):
    method visit_typedef (line 86) | def visit_typedef(self, name):
    method visit_error (line 89) | def visit_error(self, error, cls):
    method visit_identifier (line 92) | def visit_identifier(self, identifier):
  function visit_type_and_collect_info (line 98) | def visit_type_and_collect_info(ctype):
  function remove_function_pointer (line 127) | def remove_function_pointer(t):
  class CtypesType (line 137) | class CtypesType(object):
    method __init__ (line 138) | def __init__(self):
    method __repr__ (line 142) | def __repr__(self):
    method error (line 145) | def error(self, message, cls=None):
    method visit (line 148) | def visit(self, visitor):
  class CtypesSimple (line 153) | class CtypesSimple(CtypesType):
    method __init__ (line 156) | def __init__(self, name, signed, longs):
    method py_string (line 162) | def py_string(self, ignore_can_be_ctype=None):
  class CtypesSpecial (line 166) | class CtypesSpecial(CtypesType):
    method __init__ (line 167) | def __init__(self, name):
    method py_string (line 171) | def py_string(self, ignore_can_be_ctype=None):
  class CtypesTypedef (line 175) | class CtypesTypedef(CtypesType):
    method __init__ (line 178) | def __init__(self, name):
    method visit (line 182) | def visit(self, visitor):
    method py_string (line 187) | def py_string(self, ignore_can_be_ctype=None):
  class CtypesBitfield (line 191) | class CtypesBitfield(CtypesType):
    method __init__ (line 192) | def __init__(self, base, bitfield):
    method visit (line 197) | def visit(self, visitor):
    method py_string (line 201) | def py_string(self, ignore_can_be_ctype=None):
  class CtypesPointer (line 205) | class CtypesPointer(CtypesType):
    method __init__ (line 206) | def __init__(self, destination, qualifiers):
    method visit (line 211) | def visit(self, visitor):
    method py_string (line 216) | def py_string(self, ignore_can_be_ctype=None):
  class CtypesArray (line 220) | class CtypesArray(CtypesType):
    method __init__ (line 221) | def __init__(self, base, count):
    method visit (line 226) | def visit(self, visitor):
    method py_string (line 232) | def py_string(self, ignore_can_be_ctype=None):
  class CtypesNoErrorCheck (line 241) | class CtypesNoErrorCheck(object):
    method py_string (line 242) | def py_string(self, ignore_can_be_ctype=None):
    method __bool__ (line 245) | def __bool__(self):
  class CtypesPointerCast (line 251) | class CtypesPointerCast(object):
    method __init__ (line 252) | def __init__(self, target):
    method py_string (line 255) | def py_string(self, ignore_can_be_ctype=None):
  class CtypesFunction (line 259) | class CtypesFunction(CtypesType):
    method __init__ (line 260) | def __init__(self, restype, parameters, variadic, attrib=dict()):
    method visit (line 289) | def visit(self, visitor):
    method py_string (line 295) | def py_string(self, ignore_can_be_ctype=None):
  function anonymous_struct_tagnum (line 305) | def anonymous_struct_tagnum():
  function fmt_anonymous_struct_tag (line 311) | def fmt_anonymous_struct_tag(num):
  function anonymous_struct_tag (line 315) | def anonymous_struct_tag():
  class CtypesStruct (line 319) | class CtypesStruct(CtypesType):
    method __init__ (line 320) | def __init__(self, tag, attrib, variety, members, src=None):
    method get_required_types (line 343) | def get_required_types(self):
    method visit (line 348) | def visit(self, visitor):
    method get_subtypes (line 355) | def get_subtypes(self):
    method py_string (line 361) | def py_string(self, ignore_can_be_ctype=None):
  function anonymous_enum_tag (line 368) | def anonymous_enum_tag():
  class CtypesEnum (line 374) | class CtypesEnum(CtypesType):
    method __init__ (line 375) | def __init__(self, tag, enumerators, src=None):
    method visit (line 393) | def visit(self, visitor):
    method py_string (line 397) | def py_string(self, ignore_can_be_ctype=None):

FILE: ctypesgen/descriptions.py
  class DescriptionCollection (line 10) | class DescriptionCollection(object):
    method __init__ (line 13) | def __init__(
  class Description (line 27) | class Description(object):
    method __init__ (line 31) | def __init__(self, src=None):
    method add_requirements (line 57) | def add_requirements(self, reqs):
    method error (line 62) | def error(self, msg, cls=None):
    method warning (line 65) | def warning(self, msg, cls=None):
    method __repr__ (line 68) | def __repr__(self):
    method casual_name (line 71) | def casual_name(self):
    method py_name (line 74) | def py_name(self):
    method c_name (line 77) | def c_name(self):
  class ConstantDescription (line 81) | class ConstantDescription(Description):
    method __init__ (line 84) | def __init__(self, name, value, src=None):
    method casual_name (line 91) | def casual_name(self):
    method py_name (line 94) | def py_name(self):
    method c_name (line 97) | def c_name(self):
  class TypedefDescription (line 101) | class TypedefDescription(Description):
    method __init__ (line 104) | def __init__(self, name, ctype, src=None):
    method casual_name (line 109) | def casual_name(self):
    method py_name (line 112) | def py_name(self):
    method c_name (line 115) | def c_name(self):
  class StructDescription (line 119) | class StructDescription(Description):
    method __init__ (line 122) | def __init__(self, tag, attrib, variety, members, opaque, ctype, src=N...
    method casual_name (line 136) | def casual_name(self):
    method py_name (line 139) | def py_name(self):
    method c_name (line 142) | def c_name(self):
  class EnumDescription (line 146) | class EnumDescription(Description):
    method __init__ (line 149) | def __init__(self, tag, members, ctype, src=None):
    method casual_name (line 158) | def casual_name(self):
    method py_name (line 161) | def py_name(self):
    method c_name (line 164) | def c_name(self):
  class FunctionDescription (line 168) | class FunctionDescription(Description):
    method __init__ (line 171) | def __init__(self, name, restype, argtypes, errcheck, variadic, attrib...
    method casual_name (line 188) | def casual_name(self):
    method py_name (line 191) | def py_name(self):
    method c_name (line 194) | def c_name(self):
  class VariableDescription (line 198) | class VariableDescription(Description):
    method __init__ (line 201) | def __init__(self, name, ctype, src=None):
    method casual_name (line 210) | def casual_name(self):
    method py_name (line 213) | def py_name(self):
    method c_name (line 216) | def c_name(self):
  class MacroDescription (line 220) | class MacroDescription(Description):
    method __init__ (line 223) | def __init__(self, name, params, expr, src=None):
    method casual_name (line 229) | def casual_name(self):
    method py_name (line 232) | def py_name(self):
    method c_name (line 235) | def c_name(self):
  class UndefDescription (line 239) | class UndefDescription(Description):
    method __init__ (line 242) | def __init__(self, macro, src=None):
    method casual_name (line 248) | def casual_name(self):
    method py_name (line 251) | def py_name(self):
    method c_name (line 254) | def c_name(self):

FILE: ctypesgen/expressions.py
  class EvaluationContext (line 32) | class EvaluationContext(object):
    method evaluate_identifier (line 35) | def evaluate_identifier(self, name):
    method evaluate_sizeof (line 39) | def evaluate_sizeof(self, object):
    method evaluate_parameter (line 43) | def evaluate_parameter(self, name):
  class ExpressionNode (line 48) | class ExpressionNode(object):
    method __init__ (line 49) | def __init__(self):
    method error (line 52) | def error(self, message, cls=None):
    method __repr__ (line 55) | def __repr__(self):
    method visit (line 62) | def visit(self, visitor):
  class ConstantExpressionNode (line 67) | class ConstantExpressionNode(ExpressionNode):
    method __init__ (line 68) | def __init__(self, value, is_literal=False):
    method evaluate (line 73) | def evaluate(self, context):
    method py_string (line 76) | def py_string(self, can_be_ctype):
  class IdentifierExpressionNode (line 86) | class IdentifierExpressionNode(ExpressionNode):
    method __init__ (line 87) | def __init__(self, name):
    method evaluate (line 91) | def evaluate(self, context):
    method visit (line 94) | def visit(self, visitor):
    method py_string (line 98) | def py_string(self, can_be_ctype):
  class ParameterExpressionNode (line 104) | class ParameterExpressionNode(ExpressionNode):
    method __init__ (line 105) | def __init__(self, name):
    method evaluate (line 109) | def evaluate(self, context):
    method visit (line 112) | def visit(self, visitor):
    method py_string (line 115) | def py_string(self, can_be_ctype):
  class UnaryExpressionNode (line 121) | class UnaryExpressionNode(ExpressionNode):
    method __init__ (line 122) | def __init__(self, name, op, format, child_can_be_ctype, child):
    method visit (line 130) | def visit(self, visitor):
    method evaluate (line 134) | def evaluate(self, context):
    method py_string (line 140) | def py_string(self, can_be_ctype):
  class SizeOfExpressionNode (line 144) | class SizeOfExpressionNode(ExpressionNode):
    method __init__ (line 145) | def __init__(self, child):
    method visit (line 149) | def visit(self, visitor):
    method evaluate (line 153) | def evaluate(self, context):
    method py_string (line 159) | def py_string(self, can_be_ctype):
  class BinaryExpressionNode (line 166) | class BinaryExpressionNode(ExpressionNode):
    method __init__ (line 167) | def __init__(self, name, op, format, can_be_ctype, left, right):
    method visit (line 176) | def visit(self, visitor):
    method evaluate (line 181) | def evaluate(self, context):
    method py_string (line 187) | def py_string(self, can_be_ctype):
  class ConditionalExpressionNode (line 194) | class ConditionalExpressionNode(ExpressionNode):
    method __init__ (line 195) | def __init__(self, cond, yes, no):
    method visit (line 201) | def visit(self, visitor):
    method evaluate (line 207) | def evaluate(self, context):
    method py_string (line 213) | def py_string(self, can_be_ctype):
  class AttributeExpressionNode (line 221) | class AttributeExpressionNode(ExpressionNode):
    method __init__ (line 222) | def __init__(self, op, format, base, attribute):
    method visit (line 235) | def visit(self, visitor):
    method evaluate (line 239) | def evaluate(self, context):
    method py_string (line 242) | def py_string(self, can_be_ctype):
  class CallExpressionNode (line 251) | class CallExpressionNode(ExpressionNode):
    method __init__ (line 252) | def __init__(self, function, arguments):
    method visit (line 257) | def visit(self, visitor):
    method evaluate (line 263) | def evaluate(self, context):
    method py_string (line 267) | def py_string(self, can_be_ctype):
  class TypeCastExpressionNode (line 273) | class TypeCastExpressionNode(ExpressionNode):
    method __init__ (line 279) | def __init__(self, base, ctype):
    method visit (line 284) | def visit(self, visitor):
    method evaluate (line 289) | def evaluate(self, context):
    method py_string (line 292) | def py_string(self, can_be_ctype):
  class UnsupportedExpressionNode (line 332) | class UnsupportedExpressionNode(ExpressionNode):
    method __init__ (line 333) | def __init__(self, message):
    method evaluate (line 338) | def evaluate(self, context):
    method __repr__ (line 341) | def __repr__(self):
    method py_string (line 344) | def py_string(self, can_be_ctype):

FILE: ctypesgen/libraryloader.py
  function _environ_path (line 47) | def _environ_path(name):
  class LibraryLoader (line 54) | class LibraryLoader:
    class Lookup (line 63) | class Lookup:
      method __init__ (line 68) | def __init__(self, path):
      method get (line 72) | def get(self, name, calling_convention="cdecl"):
      method has (line 82) | def has(self, name, calling_convention="cdecl"):
      method __getattr__ (line 88) | def __getattr__(self, name):
    method __init__ (line 91) | def __init__(self):
    method __call__ (line 94) | def __call__(self, libname):
    method getpaths (line 107) | def getpaths(self, libname):
    method getplatformpaths (line 145) | def getplatformpaths(self, _libname):  # pylint: disable=no-self-use
  class DarwinLibraryLoader (line 153) | class DarwinLibraryLoader(LibraryLoader):
    class Lookup (line 166) | class Lookup(LibraryLoader.Lookup):
    method getplatformpaths (line 177) | def getplatformpaths(self, libname):
    method getdirs (line 188) | def getdirs(libname):
  class PosixLibraryLoader (line 227) | class PosixLibraryLoader(LibraryLoader):
    class _Directories (line 236) | class _Directories(dict):
      method __init__ (line 239) | def __init__(self):
      method add (line 243) | def add(self, directory):
      method extend (line 254) | def extend(self, directories):
      method ordered (line 259) | def ordered(self):
    method _get_ld_so_conf_dirs (line 263) | def _get_ld_so_conf_dirs(self, conf, dirs):
    method _create_ld_so_cache (line 285) | def _create_ld_so_cache(self):
    method getplatformpaths (line 355) | def getplatformpaths(self, libname):
  class WindowsLibraryLoader (line 370) | class WindowsLibraryLoader(LibraryLoader):
    class Lookup (line 375) | class Lookup(LibraryLoader.Lookup):
      method __init__ (line 378) | def __init__(self, path):
  function add_library_search_dirs (line 398) | def add_library_search_dirs(other_dirs):

FILE: ctypesgen/messages.py
  function error_message (line 34) | def error_message(msg, cls=None):
  function warning_message (line 38) | def warning_message(msg, cls=None):
  function status_message (line 42) | def status_message(msg):

FILE: ctypesgen/options.py
  function get_default_options (line 48) | def get_default_options():

FILE: ctypesgen/parser/__init__.py
  function parse (line 18) | def parse(headers, options):

FILE: ctypesgen/parser/cdeclarations.py
  class Declaration (line 14) | class Declaration(object):
    method __init__ (line 15) | def __init__(self):
    method __repr__ (line 21) | def __repr__(self):
  class Declarator (line 29) | class Declarator(object):
    method __init__ (line 32) | def __init__(self):
    method __repr__ (line 43) | def __repr__(self):
  class Pointer (line 56) | class Pointer(Declarator):
    method __init__ (line 59) | def __init__(self):
    method __repr__ (line 63) | def __repr__(self):
  class Array (line 70) | class Array(object):
    method __init__ (line 71) | def __init__(self):
    method __repr__ (line 75) | def __repr__(self):
  class Parameter (line 86) | class Parameter(object):
    method __init__ (line 87) | def __init__(self):
    method __repr__ (line 93) | def __repr__(self):
  class Type (line 103) | class Type(object):
    method __init__ (line 104) | def __init__(self):
    method __repr__ (line 108) | def __repr__(self):
  class StorageClassSpecifier (line 115) | class StorageClassSpecifier(str):
    method __repr__ (line 116) | def __repr__(self):
  class TypeSpecifier (line 120) | class TypeSpecifier(str):
    method __repr__ (line 121) | def __repr__(self):
  class StructTypeSpecifier (line 125) | class StructTypeSpecifier(object):
    method __init__ (line 126) | def __init__(self, is_union, attrib, tag, declarations):
    method __repr__ (line 134) | def __repr__(self):
  class EnumSpecifier (line 155) | class EnumSpecifier(object):
    method __init__ (line 156) | def __init__(self, tag, enumerators, src=None):
    method __repr__ (line 162) | def __repr__(self):
  class Enumerator (line 171) | class Enumerator(object):
    method __init__ (line 172) | def __init__(self, name, expression):
    method __repr__ (line 176) | def __repr__(self):
  class TypeQualifier (line 183) | class TypeQualifier(str):
    method __repr__ (line 184) | def __repr__(self):
  class PragmaPack (line 188) | class PragmaPack(object):
    method __init__ (line 191) | def __init__(self):
    method set_default (line 195) | def set_default(self):
    method push (line 198) | def push(self, id=None, value=None):
    method pop (line 205) | def pop(self, id=None):
  class Attrib (line 242) | class Attrib(dict):
    method __init__ (line 243) | def __init__(self, *a, **kw):
    method __repr__ (line 251) | def __repr__(self):
    method update (line 254) | def update(self, *a, **kw):
    method _unalias (line 258) | def _unalias(self):
  function apply_specifiers (line 274) | def apply_specifiers(specifiers, declaration):

FILE: ctypesgen/parser/cgrammar.py
  function p_translation_unit (line 123) | def p_translation_unit(p):
  function p_identifier (line 134) | def p_identifier(p):
  function p_constant_integer (line 156) | def p_constant_integer(p):
  function p_constant_float (line 172) | def p_constant_float(p):
  function p_constant_character (line 183) | def p_constant_character(p):
  function p_string_literal (line 191) | def p_string_literal(p):
  function p_multi_string_literal (line 197) | def p_multi_string_literal(p):
  function p_macro_param (line 211) | def p_macro_param(p):
  function p_primary_expression (line 221) | def p_primary_expression(p):
  function p_postfix_expression (line 233) | def p_postfix_expression(p):
  function p_argument_expression_list (line 279) | def p_argument_expression_list(p):
  function p_asm_expression (line 292) | def p_asm_expression(p):
  function p_str_opt_expr_pair_list (line 308) | def p_str_opt_expr_pair_list(p):
  function p_str_opt_expr_pair (line 315) | def p_str_opt_expr_pair(p):
  function p_volatile_opt (line 321) | def p_volatile_opt(p):
  function p_unary_expression (line 339) | def p_unary_expression(p):
  function p_unary_operator (line 362) | def p_unary_operator(p):
  function p_cast_expression (line 373) | def p_cast_expression(p):
  function p_multiplicative_expression (line 390) | def p_multiplicative_expression(p):
  function p_additive_expression (line 409) | def p_additive_expression(p):
  function p_shift_expression (line 427) | def p_shift_expression(p):
  function p_relational_expression (line 447) | def p_relational_expression(p):
  function p_equality_expression (line 467) | def p_equality_expression(p):
  function p_and_expression (line 479) | def p_and_expression(p):
  function p_exclusive_or_expression (line 491) | def p_exclusive_or_expression(p):
  function p_inclusive_or_expression (line 503) | def p_inclusive_or_expression(p):
  function p_logical_and_expression (line 515) | def p_logical_and_expression(p):
  function p_logical_or_expression (line 527) | def p_logical_or_expression(p):
  function p_conditional_expression (line 539) | def p_conditional_expression(p):
  function p_assignment_expression (line 563) | def p_assignment_expression(p):
  function p_assignment_operator (line 579) | def p_assignment_operator(p):
  function p_expression (line 595) | def p_expression(p):
  function p_constant_expression (line 603) | def p_constant_expression(p):
  function p_declaration (line 609) | def p_declaration(p):
  function p_declaration_impl (line 617) | def p_declaration_impl(p):
  function p_declaration_specifier_list (line 637) | def p_declaration_specifier_list(p):
  function p_declaration_specifier (line 651) | def p_declaration_specifier(p):
  function p_init_declarator_list (line 660) | def p_init_declarator_list(p):
  function p_init_declarator (line 670) | def p_init_declarator(p):
  function p_storage_class_specifier (line 682) | def p_storage_class_specifier(p):
  function p_type_specifier (line 692) | def p_type_specifier(p):
  function p_struct_or_union_specifier (line 713) | def p_struct_or_union_specifier(p):
  function p_struct_or_union (line 742) | def p_struct_or_union(p):
  function p_gcc_attributes (line 749) | def p_gcc_attributes(p):
  function p_gcc_attribute (line 761) | def p_gcc_attribute(p):
  function p_gcc_attrib_list (line 768) | def p_gcc_attrib_list(p):
  function p_gcc_attrib (line 778) | def p_gcc_attrib(p):
  function p_member_declaration_list (line 793) | def p_member_declaration_list(p):
  function p_member_declaration (line 803) | def p_member_declaration(p):
  function p_specifier_qualifier_list (line 825) | def p_specifier_qualifier_list(p):
  function p_specifier_qualifier (line 835) | def p_specifier_qualifier(p):
  function p_member_declarator_list (line 842) | def p_member_declarator_list(p):
  function p_member_declarator (line 852) | def p_member_declarator(p):
  function p_enum_specifier (line 869) | def p_enum_specifier(p):
  function p_enumerator_list (line 885) | def p_enumerator_list(p):
  function p_enumerator_list_iso (line 894) | def p_enumerator_list_iso(p):
  function p_enumerator (line 904) | def p_enumerator(p):
  function p_type_qualifier (line 914) | def p_type_qualifier(p):
  function p_function_specifier (line 922) | def p_function_specifier(p):
  function p_declarator (line 928) | def p_declarator(p):
  function p_direct_declarator (line 943) | def p_direct_declarator(p):
  function p_pointer (line 979) | def p_pointer(p):
  function p_type_qualifier_list (line 1004) | def p_type_qualifier_list(p):
  function p_parameter_type_list (line 1016) | def p_parameter_type_list(p):
  function p_parameter_list (line 1026) | def p_parameter_list(p):
  function p_parameter_declaration (line 1036) | def p_parameter_declaration(p):
  function p_identifier_list (line 1054) | def p_identifier_list(p):
  function p_type_name (line 1068) | def p_type_name(p):
  function p_abstract_declarator (line 1085) | def p_abstract_declarator(p):
  function p_direct_abstract_declarator (line 1114) | def p_direct_abstract_declarator(p):
  function p_initializer (line 1161) | def p_initializer(p):
  function p_initializer_list (line 1168) | def p_initializer_list(p):
  function p_statement (line 1174) | def p_statement(p):
  function p_labeled_statement (line 1184) | def p_labeled_statement(p):
  function p_compound_statement (line 1191) | def p_compound_statement(p):
  function p_compound_statement_error (line 1199) | def p_compound_statement_error(p):
  function p_declaration_list (line 1205) | def p_declaration_list(p):
  function p_statement_list (line 1211) | def p_statement_list(p):
  function p_expression_statement (line 1217) | def p_expression_statement(p):
  function p_expression_statement_error (line 1223) | def p_expression_statement_error(p):
  function p_selection_statement (line 1229) | def p_selection_statement(p):
  function p_iteration_statement (line 1236) | def p_iteration_statement(p):
  function p_jump_statement (line 1244) | def p_jump_statement(p):
  function p_external_declaration (line 1253) | def p_external_declaration(p):
  function p_function_definition (line 1260) | def p_function_definition(p):
  function p_directive (line 1269) | def p_directive(p):
  function p_define (line 1276) | def p_define(p):
  function p_define_error (line 1313) | def p_define_error(p):
  function p_undefine (line 1341) | def p_undefine(p):
  function p_macro_parameter_list (line 1352) | def p_macro_parameter_list(p):
  function p_error (line 1363) | def p_error(t):
  function p_pragma (line 1377) | def p_pragma(p):
  function p_pragma_pack (line 1383) | def p_pragma_pack(p):
  function p_pragma_pack_stack_args (line 1409) | def p_pragma_pack_stack_args(p):
  function p_pragma_directive_list (line 1433) | def p_pragma_directive_list(p):
  function p_pragma_directive (line 1443) | def p_pragma_directive(p):
  function main (line 1450) | def main():

FILE: ctypesgen/parser/cparser.py
  class CLexer (line 20) | class CLexer(object):
    method __init__ (line 21) | def __init__(self, cparser):
    method input (line 28) | def input(self, tokens):
    method token (line 32) | def token(self):
  class CParser (line 80) | class CParser(object):
    method __init__ (line 87) | def __init__(self, options):
    method parse (line 112) | def parse(self, filename, debug=False):
    method handle_error (line 128) | def handle_error(self, message, filename, lineno):
    method handle_pp_error (line 137) | def handle_pp_error(self, message):
    method handle_status (line 145) | def handle_status(self, message):
    method handle_define (line 152) | def handle_define(self, name, params, value, filename, lineno):
    method handle_define_constant (line 161) | def handle_define_constant(self, name, value, filename, lineno):
    method handle_define_macro (line 168) | def handle_define_macro(self, name, params, value, filename, lineno):
    method handle_undefine (line 176) | def handle_undefine(self, name, filename, lineno):
    method impl_handle_declaration (line 182) | def impl_handle_declaration(self, declaration, filename, lineno):
    method handle_declaration (line 197) | def handle_declaration(self, declaration, filename, lineno):
  class DebugCParser (line 206) | class DebugCParser(CParser):
    method handle_define (line 211) | def handle_define(self, name, value, filename, lineno):
    method handle_define_constant (line 214) | def handle_define_constant(self, name, value, filename, lineno):
    method handle_declaration (line 217) | def handle_declaration(self, declaration, filename, lineno):
    method get_ctypes_type (line 220) | def get_ctypes_type(self, typ, declarator):
    method handle_define_unparseable (line 223) | def handle_define_unparseable(self, name, params, value, filename, lin...

FILE: ctypesgen/parser/ctypesparser.py
  function make_enum_from_specifier (line 41) | def make_enum_from_specifier(specifier):
  function get_decl_id (line 68) | def get_decl_id(decl):
  class CtypesParser (line 78) | class CtypesParser(CParser):
    method __init__ (line 84) | def __init__(self, options):
    method make_struct_from_specifier (line 90) | def make_struct_from_specifier(self, specifier):
    method get_ctypes_type (line 116) | def get_ctypes_type(self, typ, declarator, check_qualifiers=False):
    method handle_declaration (line 218) | def handle_declaration(self, declaration, filename, lineno):
    method handle_ctypes_new_type (line 244) | def handle_ctypes_new_type(self, ctype, filename, lineno):
    method handle_ctypes_typedef (line 247) | def handle_ctypes_typedef(self, name, ctype, filename, lineno):
    method handle_ctypes_function (line 250) | def handle_ctypes_function(
    method handle_ctypes_variable (line 255) | def handle_ctypes_variable(self, name, ctype, filename, lineno):

FILE: ctypesgen/parser/datacollectingparser.py
  class DataCollectingParser (line 28) | class DataCollectingParser(ctypesparser.CtypesParser, CtypesTypeVisitor):
    method __init__ (line 35) | def __init__(self, headers, options):
    method parse (line 70) | def parse(self):
    method handle_define_constant (line 86) | def handle_define_constant(self, name, expr, filename, lineno):
    method handle_define_unparseable (line 91) | def handle_define_unparseable(self, name, params, value, filename, lin...
    method handle_define_macro (line 104) | def handle_define_macro(self, name, params, expr, filename, lineno):
    method handle_undefine (line 109) | def handle_undefine(self, macro, filename, lineno):
    method handle_ctypes_typedef (line 113) | def handle_ctypes_typedef(self, name, ctype, filename, lineno):
    method handle_ctypes_new_type (line 123) | def handle_ctypes_new_type(self, ctype, filename, lineno):
    method handle_ctypes_function (line 130) | def handle_ctypes_function(
    method handle_ctypes_variable (line 146) | def handle_ctypes_variable(self, name, ctype, filename, lineno):
    method handle_struct (line 156) | def handle_struct(self, ctypestruct, filename, lineno):
    method handle_enum (line 220) | def handle_enum(self, ctypeenum, filename, lineno):
    method handle_macro (line 273) | def handle_macro(self, name, params, expr, filename, lineno):
    method handle_error (line 318) | def handle_error(self, message, filename, lineno):
    method handle_pp_error (line 322) | def handle_pp_error(self, message):
    method handle_status (line 326) | def handle_status(self, message):
    method visit_struct (line 330) | def visit_struct(self, struct):
    method visit_enum (line 333) | def visit_enum(self, enum):
    method data (line 336) | def data(self):

FILE: ctypesgen/parser/lex.py
  class LexError (line 57) | class LexError(Exception):
    method __init__ (line 58) | def __init__(self, message, s):
  class LexToken (line 64) | class LexToken(object):
    method __str__ (line 65) | def __str__(self):
    method __repr__ (line 68) | def __repr__(self):
  class PlyLogger (line 75) | class PlyLogger(object):
    method __init__ (line 76) | def __init__(self, f):
    method critical (line 79) | def critical(self, msg, *args, **kwargs):
    method warning (line 82) | def warning(self, msg, *args, **kwargs):
    method error (line 85) | def error(self, msg, *args, **kwargs):
  class NullLogger (line 93) | class NullLogger(object):
    method __getattribute__ (line 94) | def __getattribute__(self, name):
    method __call__ (line 97) | def __call__(self, *args, **kwargs):
  class Lexer (line 115) | class Lexer:
    method __init__ (line 116) | def __init__(self):
    method clone (line 144) | def clone(self, object=None):
    method writetab (line 174) | def writetab(self, lextab, outputdir=''):
    method readtab (line 211) | def readtab(self, tabfile, fdict):
    method input (line 251) | def input(self, s):
    method begin (line 263) | def begin(self, state):
    method push_state (line 276) | def push_state(self, state):
    method pop_state (line 283) | def pop_state(self):
    method current_state (line 289) | def current_state(self):
    method skip (line 295) | def skip(self, n):
    method token (line 305) | def token(self):
    method __iter__ (line 415) | def __iter__(self):
    method next (line 418) | def next(self):
  function _get_regex (line 439) | def _get_regex(func):
  function get_caller_module_dict (line 449) | def get_caller_module_dict(levels):
  function _funcs_to_names (line 462) | def _funcs_to_names(funclist, namelist):
  function _names_to_funcs (line 477) | def _names_to_funcs(namelist, fdict):
  function _form_master_re (line 493) | def _form_master_re(relist, reflags, ldict, toknames):
  function _statetoken (line 533) | def _statetoken(s, names):
  class LexerReflect (line 557) | class LexerReflect(object):
    method __init__ (line 558) | def __init__(self, ldict, log=None, reflags=0):
    method get_all (line 569) | def get_all(self):
    method validate_all (line 576) | def validate_all(self):
    method get_tokens (line 583) | def get_tokens(self):
    method validate_tokens (line 603) | def validate_tokens(self):
    method get_literals (line 614) | def get_literals(self):
    method validate_literals (line 620) | def validate_literals(self):
    method get_states (line 631) | def get_states(self):
    method get_rules (line 662) | def get_rules(self):
    method validate_rules (line 728) | def validate_rules(self):
    method validate_module (line 831) | def validate_module(self, module):
  function lex (line 862) | def lex(module=None, object=None, debug=False, optimize=False, lextab='l...
  function runmain (line 1054) | def runmain(lexer=None, data=None):
  function TOKEN (line 1088) | def TOKEN(r):

FILE: ctypesgen/parser/pplexer.py
  class StringLiteral (line 73) | class StringLiteral(str):
    method __new__ (line 74) | def __new__(cls, value):
  function t_ANY_directive (line 141) | def t_ANY_directive(t):
  function t_ANY_f_const_1 (line 149) | def t_ANY_f_const_1(t):
  function t_ANY_f_const_2 (line 159) | def t_ANY_f_const_2(t):
  function t_ANY_f_const_3 (line 169) | def t_ANY_f_const_3(t):
  function t_ANY_f_const_4 (line 179) | def t_ANY_f_const_4(t):
  function t_ANY_f_const_5 (line 187) | def t_ANY_f_const_5(t):
  function t_ANY_f_const_6 (line 195) | def t_ANY_f_const_6(t):
  function t_ANY_i_const_bin (line 203) | def t_ANY_i_const_bin(t):
  function t_ANY_i_const_hex (line 211) | def t_ANY_i_const_hex(t):
  function t_ANY_i_const_dec (line 219) | def t_ANY_i_const_dec(t):
  function t_ANY_i_const_oct (line 227) | def t_ANY_i_const_oct(t):
  function t_ANY_character_constant (line 239) | def t_ANY_character_constant(t):
  function t_ANY_string_literal (line 248) | def t_ANY_string_literal(t):
  function t_INITIAL_identifier (line 255) | def t_INITIAL_identifier(t):
  function t_DEFINE_identifier (line 261) | def t_DEFINE_identifier(t):
  function t_INITIAL_newline (line 292) | def t_INITIAL_newline(t):
  function t_INITIAL_pp_undefine (line 298) | def t_INITIAL_pp_undefine(t):
  function t_INITIAL_pp_define (line 307) | def t_INITIAL_pp_define(t):
  function t_INITIAL_pragma (line 316) | def t_INITIAL_pragma(t):
  function t_PRAGMA_pack (line 323) | def t_PRAGMA_pack(t):
  function t_PRAGMA_newline (line 329) | def t_PRAGMA_newline(t):
  function t_PRAGMA_identifier (line 337) | def t_PRAGMA_identifier(t):
  function t_PRAGMA_error (line 342) | def t_PRAGMA_error(t):
  function t_DEFINE_newline (line 350) | def t_DEFINE_newline(t):
  function t_DEFINE_pp_param_op (line 363) | def t_DEFINE_pp_param_op(t):
  function t_INITIAL_error (line 371) | def t_INITIAL_error(t):
  function t_DEFINE_error (line 376) | def t_DEFINE_error(t):

FILE: ctypesgen/parser/preprocessor.py
  class PreprocessorLexer (line 28) | class PreprocessorLexer(lex.Lexer):
    method __init__ (line 29) | def __init__(self):
    method input (line 34) | def input(self, data, filename=None):
    method token (line 41) | def token(self):
  class PreprocessorParser (line 57) | class PreprocessorParser(object):
    method __init__ (line 58) | def __init__(self, options, cparser):
    method parse (line 89) | def parse(self, filename):

FILE: ctypesgen/parser/yacc.py
  class PlyLogger (line 108) | class PlyLogger(object):
    method __init__ (line 109) | def __init__(self, f):
    method debug (line 112) | def debug(self, msg, *args, **kwargs):
    method warning (line 117) | def warning(self, msg, *args, **kwargs):
    method error (line 120) | def error(self, msg, *args, **kwargs):
  class NullLogger (line 126) | class NullLogger(object):
    method __getattribute__ (line 127) | def __getattribute__(self, name):
    method __call__ (line 130) | def __call__(self, *args, **kwargs):
  class YaccError (line 134) | class YaccError(Exception):
  function format_result (line 138) | def format_result(r):
  function format_stack_entry (line 148) | def format_stack_entry(r):
  function errok (line 174) | def errok():
  function restart (line 178) | def restart():
  function token (line 182) | def token():
  function call_errorfunc (line 187) | def call_errorfunc(errorfunc, token, parser):
  class YaccSymbol (line 216) | class YaccSymbol:
    method __str__ (line 217) | def __str__(self):
    method __repr__ (line 220) | def __repr__(self):
  class YaccProduction (line 232) | class YaccProduction:
    method __init__ (line 233) | def __init__(self, s, stack=None):
    method __getitem__ (line 239) | def __getitem__(self, n):
    method __setitem__ (line 247) | def __setitem__(self, n, v):
    method __getslice__ (line 250) | def __getslice__(self, i, j):
    method __len__ (line 253) | def __len__(self):
    method lineno (line 256) | def lineno(self, n):
    method set_lineno (line 259) | def set_lineno(self, n, lineno):
    method linespan (line 262) | def linespan(self, n):
    method lexpos (line 267) | def lexpos(self, n):
    method set_lexpos (line 270) | def set_lexpos(self, n, lexpos):
    method lexspan (line 273) | def lexspan(self, n):
    method error (line 278) | def error(self):
  class LRParser (line 287) | class LRParser:
    method __init__ (line 288) | def __init__(self, lrtab, errorf):
    method errok (line 296) | def errok(self):
    method restart (line 299) | def restart(self):
    method set_defaulted_states (line 315) | def set_defaulted_states(self):
    method disable_defaulted_states (line 322) | def disable_defaulted_states(self):
    method parse (line 325) | def parse(self, input=None, lexer=None, debug=False, tracking=False, t...
    method parsedebug (line 350) | def parsedebug(self, input=None, lexer=None, debug=False, tracking=Fal...
    method parseopt (line 699) | def parseopt(self, input=None, lexer=None, debug=False, tracking=False...
    method parseopt_notrack (line 1007) | def parseopt_notrack(self, input=None, lexer=None, debug=False, tracki...
  class Production (line 1315) | class Production(object):
    method __init__ (line 1317) | def __init__(self, number, name, prod, precedence=('right', 0), func=N...
    method __str__ (line 1347) | def __str__(self):
    method __repr__ (line 1350) | def __repr__(self):
    method __len__ (line 1353) | def __len__(self):
    method __nonzero__ (line 1356) | def __nonzero__(self):
    method __getitem__ (line 1359) | def __getitem__(self, index):
    method lr_item (line 1363) | def lr_item(self, n):
    method bind (line 1379) | def bind(self, pdict):
  class MiniProduction (line 1387) | class MiniProduction(object):
    method __init__ (line 1388) | def __init__(self, str, name, len, func, file, line):
    method __str__ (line 1397) | def __str__(self):
    method __repr__ (line 1400) | def __repr__(self):
    method bind (line 1404) | def bind(self, pdict):
  class LRItem (line 1433) | class LRItem(object):
    method __init__ (line 1434) | def __init__(self, p, n):
    method __str__ (line 1445) | def __str__(self):
    method __repr__ (line 1452) | def __repr__(self):
  function rightmost_terminal (line 1460) | def rightmost_terminal(symbols, terminals):
  class GrammarError (line 1476) | class GrammarError(YaccError):
  class Grammar (line 1479) | class Grammar(object):
    method __init__ (line 1480) | def __init__(self, terminals):
    method __len__ (line 1516) | def __len__(self):
    method __getitem__ (line 1519) | def __getitem__(self, index):
    method set_precedence (line 1530) | def set_precedence(self, term, assoc, level):
    method add_production (line 1555) | def add_production(self, prodname, syms, func=None, file='', line=0):
    method set_start (line 1639) | def set_start(self, start=None):
    method find_unreachable (line 1655) | def find_unreachable(self):
    method infinite_cycles (line 1678) | def infinite_cycles(self):
    method undefined_symbols (line 1742) | def undefined_symbols(self):
    method unused_terminals (line 1759) | def unused_terminals(self):
    method unused_rules (line 1774) | def unused_rules(self):
    method unused_precedence (line 1791) | def unused_precedence(self):
    method _first (line 1807) | def _first(self, beta):
    method compute_first (line 1842) | def compute_first(self):
    method compute_follow (line 1879) | def compute_follow(self, start=None):
    method build_lritems (line 1938) | def build_lritems(self):
  class VersionError (line 1974) | class VersionError(YaccError):
  class LRTable (line 1977) | class LRTable(object):
    method __init__ (line 1978) | def __init__(self):
    method read_table (line 1984) | def read_table(self, module):
    method read_pickle (line 2004) | def read_pickle(self, filename):
    method bind_callables (line 2032) | def bind_callables(self, pdict):
  function digraph (line 2061) | def digraph(X, R, FP):
  function traverse (line 2072) | def traverse(x, N, stack, F, X, R, FP):
  class LALRError (line 2095) | class LALRError(YaccError):
  class LRGeneratedTable (line 2105) | class LRGeneratedTable(LRTable):
    method __init__ (line 2106) | def __init__(self, grammar, method='LALR', log=None):
    method lr0_closure (line 2143) | def lr0_closure(self, I):
    method lr0_goto (line 2169) | def lr0_goto(self, I, x):
    method lr0_items (line 2204) | def lr0_items(self):
    method compute_nullable_nonterminals (line 2260) | def compute_nullable_nonterminals(self):
    method find_nonterminal_transitions (line 2289) | def find_nonterminal_transitions(self, C):
    method dr_relation (line 2309) | def dr_relation(self, C, trans, nullable):
    method reads_relation (line 2333) | def reads_relation(self, C, trans, empty):
    method compute_lookback_includes (line 2376) | def compute_lookback_includes(self, C, trans, nullable):
    method compute_read_sets (line 2456) | def compute_read_sets(self, C, ntrans, nullable):
    method compute_follow_sets (line 2478) | def compute_follow_sets(self, ntrans, readsets, inclsets):
    method add_lookaheads (line 2496) | def add_lookaheads(self, lookbacks, followset):
    method add_lalr_lookaheads (line 2514) | def add_lalr_lookaheads(self, C):
    method lr_parse_table (line 2538) | def lr_parse_table(self):
    method write_table (line 2731) | def write_table(self, tabmodule, outputdir='', signature=''):
    method pickle_table (line 2854) | def pickle_table(self, filename, signature=''):
  function get_caller_module_dict (line 2889) | def get_caller_module_dict(levels):
  function parse_grammar (line 2901) | def parse_grammar(doc, file, line):
  class ParserReflect (line 2942) | class ParserReflect(object):
    method __init__ (line 2943) | def __init__(self, pdict, log=None):
    method get_all (line 2958) | def get_all(self):
    method validate_all (line 2966) | def validate_all(self):
    method signature (line 2976) | def signature(self):
    method validate_modules (line 3003) | def validate_modules(self):
    method get_start (line 3028) | def get_start(self):
    method validate_start (line 3032) | def validate_start(self):
    method get_error_func (line 3038) | def get_error_func(self):
    method validate_error_func (line 3042) | def validate_error_func(self):
    method get_tokens (line 3064) | def get_tokens(self):
    method validate_tokens (line 3084) | def validate_tokens(self):
    method get_precedence (line 3098) | def get_precedence(self):
    method validate_precedence (line 3102) | def validate_precedence(self):
    method get_pfunctions (line 3133) | def get_pfunctions(self):
    method validate_pfunctions (line 3154) | def validate_pfunctions(self):
  function yacc (line 3220) | def yacc(method='LALR', debug=yaccdebug, module=None, tabmodule=tab_modu...

FILE: ctypesgen/printer_json/printer.py
  function todict (line 11) | def todict(obj, classkey="Klass"):
  class WrapperPrinter (line 37) | class WrapperPrinter:
    method __init__ (line 38) | def __init__(self, outpath, options, data):
    method __del__ (line 70) | def __del__(self):
    method print_group (line 73) | def print_group(self, list, name, function):
    method print_library (line 77) | def print_library(self, library):
    method print_constant (line 80) | def print_constant(self, constant):
    method print_undef (line 83) | def print_undef(self, undef):
    method print_typedef (line 86) | def print_typedef(self, typedef):
    method print_struct (line 89) | def print_struct(self, struct):
    method print_struct_members (line 100) | def print_struct_members(self, struct):
    method print_enum (line 103) | def print_enum(self, enum):
    method print_function (line 113) | def print_function(self, function):
    method print_variable (line 126) | def print_variable(self, variable):
    method print_macro (line 132) | def print_macro(self, macro):

FILE: ctypesgen/printer_python/preamble.py
  class UserString (line 19) | class UserString:
    method __init__ (line 20) | def __init__(self, seq):
    method __bytes__ (line 28) | def __bytes__(self):
    method __str__ (line 31) | def __str__(self):
    method __repr__ (line 34) | def __repr__(self):
    method __int__ (line 37) | def __int__(self):
    method __long__ (line 40) | def __long__(self):
    method __float__ (line 43) | def __float__(self):
    method __complex__ (line 46) | def __complex__(self):
    method __hash__ (line 49) | def __hash__(self):
    method __le__ (line 52) | def __le__(self, string):
    method __lt__ (line 58) | def __lt__(self, string):
    method __ge__ (line 64) | def __ge__(self, string):
    method __gt__ (line 70) | def __gt__(self, string):
    method __eq__ (line 76) | def __eq__(self, string):
    method __ne__ (line 82) | def __ne__(self, string):
    method __contains__ (line 88) | def __contains__(self, char):
    method __len__ (line 91) | def __len__(self):
    method __getitem__ (line 94) | def __getitem__(self, index):
    method __getslice__ (line 97) | def __getslice__(self, start, end):
    method __add__ (line 102) | def __add__(self, other):
    method __radd__ (line 110) | def __radd__(self, other):
    method __mul__ (line 116) | def __mul__(self, n):
    method __mod__ (line 121) | def __mod__(self, args):
    method capitalize (line 125) | def capitalize(self):
    method center (line 128) | def center(self, width, *args):
    method count (line 131) | def count(self, sub, start=0, end=sys.maxsize):
    method decode (line 134) | def decode(self, encoding=None, errors=None):  # XXX improve this?
    method encode (line 143) | def encode(self, encoding=None, errors=None):  # XXX improve this?
    method endswith (line 152) | def endswith(self, suffix, start=0, end=sys.maxsize):
    method expandtabs (line 155) | def expandtabs(self, tabsize=8):
    method find (line 158) | def find(self, sub, start=0, end=sys.maxsize):
    method index (line 161) | def index(self, sub, start=0, end=sys.maxsize):
    method isalpha (line 164) | def isalpha(self):
    method isalnum (line 167) | def isalnum(self):
    method isdecimal (line 170) | def isdecimal(self):
    method isdigit (line 173) | def isdigit(self):
    method islower (line 176) | def islower(self):
    method isnumeric (line 179) | def isnumeric(self):
    method isspace (line 182) | def isspace(self):
    method istitle (line 185) | def istitle(self):
    method isupper (line 188) | def isupper(self):
    method join (line 191) | def join(self, seq):
    method ljust (line 194) | def ljust(self, width, *args):
    method lower (line 197) | def lower(self):
    method lstrip (line 200) | def lstrip(self, chars=None):
    method partition (line 203) | def partition(self, sep):
    method replace (line 206) | def replace(self, old, new, maxsplit=-1):
    method rfind (line 209) | def rfind(self, sub, start=0, end=sys.maxsize):
    method rindex (line 212) | def rindex(self, sub, start=0, end=sys.maxsize):
    method rjust (line 215) | def rjust(self, width, *args):
    method rpartition (line 218) | def rpartition(self, sep):
    method rstrip (line 221) | def rstrip(self, chars=None):
    method split (line 224) | def split(self, sep=None, maxsplit=-1):
    method rsplit (line 227) | def rsplit(self, sep=None, maxsplit=-1):
    method splitlines (line 230) | def splitlines(self, keepends=0):
    method startswith (line 233) | def startswith(self, prefix, start=0, end=sys.maxsize):
    method strip (line 236) | def strip(self, chars=None):
    method swapcase (line 239) | def swapcase(self):
    method title (line 242) | def title(self):
    method translate (line 245) | def translate(self, *args):
    method upper (line 248) | def upper(self):
    method zfill (line 251) | def zfill(self, width):
  class MutableString (line 255) | class MutableString(UserString):
    method __init__ (line 271) | def __init__(self, string=""):
    method __hash__ (line 274) | def __hash__(self):
    method __setitem__ (line 277) | def __setitem__(self, index, sub):
    method __delitem__ (line 284) | def __delitem__(self, index):
    method __setslice__ (line 291) | def __setslice__(self, start, end, sub):
    method __delslice__ (line 301) | def __delslice__(self, start, end):
    method immutable (line 306) | def immutable(self):
    method __iadd__ (line 309) | def __iadd__(self, other):
    method __imul__ (line 318) | def __imul__(self, n):
  class String (line 323) | class String(MutableString, ctypes.Union):
    method __init__ (line 326) | def __init__(self, obj=b""):
    method __len__ (line 332) | def __len__(self):
    method from_param (line 335) | def from_param(cls, obj):
  function ReturnString (line 375) | def ReturnString(obj, func=None, arguments=None):
  function UNCHECKED (line 386) | def UNCHECKED(type):
  class _variadic_function (line 395) | class _variadic_function(object):
    method __init__ (line 396) | def __init__(self, func, restype, argtypes, errcheck):
    method _as_parameter_ (line 403) | def _as_parameter_(self):
    method __call__ (line 407) | def __call__(self, *args):
  function ord_if_char (line 417) | def ord_if_char(value):

FILE: ctypesgen/printer_python/printer.py
  class WrapperPrinter (line 19) | class WrapperPrinter:
    method __init__ (line 20) | def __init__(self, outpath, options, data):
    method __del__ (line 64) | def __del__(self):
    method print_group (line 67) | def print_group(self, list, name, function):
    method srcinfo (line 79) | def srcinfo(self, src):
    method template_subs (line 93) | def template_subs(self):
    method print_header (line 110) | def print_header(self):
    method print_preamble (line 132) | def print_preamble(self):
    method _copy_preamble_loader_files (line 145) | def _copy_preamble_loader_files(self, path):
    method print_loader (line 189) | def print_loader(self):
    method print_library (line 205) | def print_library(self, library):
    method print_module (line 208) | def print_module(self, module):
    method print_constant (line 211) | def print_constant(self, constant):
    method print_undef (line 215) | def print_undef(self, undef):
    method print_typedef (line 225) | def print_typedef(self, typedef):
    method print_struct (line 229) | def print_struct(self, struct):
    method print_struct_members (line 234) | def print_struct_members(self, struct):
    method print_enum (line 289) | def print_enum(self, enum):
    method print_function (line 294) | def print_function(self, function):
    method print_fixed_function (line 300) | def print_fixed_function(self, function):
    method print_variadic_function (line 355) | def print_variadic_function(self, function):
    method print_variable (line 394) | def print_variable(self, variable):
    method print_macro (line 420) | def print_macro(self, macro):
    method print_simple_macro (line 428) | def print_simple_macro(self, macro):
    method print_func_macro (line 440) | def print_func_macro(self, macro):
    method strip_prefixes (line 449) | def strip_prefixes(self):
    method insert_file (line 475) | def insert_file(self, filename):

FILE: ctypesgen/processor/dependencies.py
  function find_dependencies (line 10) | def find_dependencies(data, opts):

FILE: ctypesgen/processor/operations.py
  function automatically_typedef_structs (line 23) | def automatically_typedef_structs(data, options):
  function remove_NULL (line 38) | def remove_NULL(data, options):
  function remove_descriptions_in_system_headers (line 47) | def remove_descriptions_in_system_headers(data, opts):
  function remove_macros (line 67) | def remove_macros(data, opts):
  function filter_by_regexes_exclude (line 74) | def filter_by_regexes_exclude(data, opts):
  function filter_by_regexes_include (line 84) | def filter_by_regexes_include(data, opts):
  function fix_conflicting_names (line 95) | def fix_conflicting_names(data, opts):
  function find_source_libraries (line 255) | def find_source_libraries(data, opts):

FILE: ctypesgen/processor/pipeline.py
  function process (line 52) | def process(data, options):
  function calculate_final_inclusion (line 73) | def calculate_final_inclusion(data, opts):
  function print_errors_encountered (line 110) | def print_errors_encountered(data, opts):

FILE: ctypesgen/version.py
  function version_tuple (line 14) | def version_tuple(v):
  function read_file_version (line 25) | def read_file_version():
  function version (line 32) | def version():
  function version_number (line 50) | def version_number():
  function compatible (line 54) | def compatible(v0, v1):
  function write_version_file (line 60) | def write_version_file(v=None):

FILE: demo/demoapp.c
  function main (line 17) | int main(int argc, char **argv)

FILE: demo/demoapp.py
  function do_demo (line 19) | def do_demo():
  function main (line 28) | def main(argv=None):

FILE: demo/demolib.c
  function trivial_add (line 17) | int trivial_add(int a, int b)

FILE: demo/pydemolib.py
  class UserString (line 31) | class UserString:
    method __init__ (line 32) | def __init__(self, seq):
    method __bytes__ (line 40) | def __bytes__(self):
    method __str__ (line 43) | def __str__(self):
    method __repr__ (line 46) | def __repr__(self):
    method __int__ (line 49) | def __int__(self):
    method __long__ (line 52) | def __long__(self):
    method __float__ (line 55) | def __float__(self):
    method __complex__ (line 58) | def __complex__(self):
    method __hash__ (line 61) | def __hash__(self):
    method __le__ (line 64) | def __le__(self, string):
    method __lt__ (line 70) | def __lt__(self, string):
    method __ge__ (line 76) | def __ge__(self, string):
    method __gt__ (line 82) | def __gt__(self, string):
    method __eq__ (line 88) | def __eq__(self, string):
    method __ne__ (line 94) | def __ne__(self, string):
    method __contains__ (line 100) | def __contains__(self, char):
    method __len__ (line 103) | def __len__(self):
    method __getitem__ (line 106) | def __getitem__(self, index):
    method __getslice__ (line 109) | def __getslice__(self, start, end):
    method __add__ (line 114) | def __add__(self, other):
    method __radd__ (line 122) | def __radd__(self, other):
    method __mul__ (line 128) | def __mul__(self, n):
    method __mod__ (line 133) | def __mod__(self, args):
    method capitalize (line 137) | def capitalize(self):
    method center (line 140) | def center(self, width, *args):
    method count (line 143) | def count(self, sub, start=0, end=sys.maxsize):
    method decode (line 146) | def decode(self, encoding=None, errors=None):  # XXX improve this?
    method encode (line 155) | def encode(self, encoding=None, errors=None):  # XXX improve this?
    method endswith (line 164) | def endswith(self, suffix, start=0, end=sys.maxsize):
    method expandtabs (line 167) | def expandtabs(self, tabsize=8):
    method find (line 170) | def find(self, sub, start=0, end=sys.maxsize):
    method index (line 173) | def index(self, sub, start=0, end=sys.maxsize):
    method isalpha (line 176) | def isalpha(self):
    method isalnum (line 179) | def isalnum(self):
    method isdecimal (line 182) | def isdecimal(self):
    method isdigit (line 185) | def isdigit(self):
    method islower (line 188) | def islower(self):
    method isnumeric (line 191) | def isnumeric(self):
    method isspace (line 194) | def isspace(self):
    method istitle (line 197) | def istitle(self):
    method isupper (line 200) | def isupper(self):
    method join (line 203) | def join(self, seq):
    method ljust (line 206) | def ljust(self, width, *args):
    method lower (line 209) | def lower(self):
    method lstrip (line 212) | def lstrip(self, chars=None):
    method partition (line 215) | def partition(self, sep):
    method replace (line 218) | def replace(self, old, new, maxsplit=-1):
    method rfind (line 221) | def rfind(self, sub, start=0, end=sys.maxsize):
    method rindex (line 224) | def rindex(self, sub, start=0, end=sys.maxsize):
    method rjust (line 227) | def rjust(self, width, *args):
    method rpartition (line 230) | def rpartition(self, sep):
    method rstrip (line 233) | def rstrip(self, chars=None):
    method split (line 236) | def split(self, sep=None, maxsplit=-1):
    method rsplit (line 239) | def rsplit(self, sep=None, maxsplit=-1):
    method splitlines (line 242) | def splitlines(self, keepends=0):
    method startswith (line 245) | def startswith(self, prefix, start=0, end=sys.maxsize):
    method strip (line 248) | def strip(self, chars=None):
    method swapcase (line 251) | def swapcase(self):
    method title (line 254) | def title(self):
    method translate (line 257) | def translate(self, *args):
    method upper (line 260) | def upper(self):
    method zfill (line 263) | def zfill(self, width):
  class MutableString (line 267) | class MutableString(UserString):
    method __init__ (line 283) | def __init__(self, string=""):
    method __hash__ (line 286) | def __hash__(self):
    method __setitem__ (line 289) | def __setitem__(self, index, sub):
    method __delitem__ (line 296) | def __delitem__(self, index):
    method __setslice__ (line 303) | def __setslice__(self, start, end, sub):
    method __delslice__ (line 313) | def __delslice__(self, start, end):
    method immutable (line 318) | def immutable(self):
    method __iadd__ (line 321) | def __iadd__(self, other):
    method __imul__ (line 330) | def __imul__(self, n):
  class String (line 335) | class String(MutableString, ctypes.Union):
    method __init__ (line 339) | def __init__(self, obj=b""):
    method __len__ (line 345) | def __len__(self):
    method from_param (line 348) | def from_param(cls, obj):
  function ReturnString (line 388) | def ReturnString(obj, func=None, arguments=None):
  function UNCHECKED (line 399) | def UNCHECKED(type):
  class _variadic_function (line 408) | class _variadic_function(object):
    method __init__ (line 409) | def __init__(self, func, restype, argtypes, errcheck):
    method _as_parameter_ (line 416) | def _as_parameter_(self):
    method __call__ (line 420) | def __call__(self, *args):
  function ord_if_char (line 430) | def ord_if_char(value):
  function _environ_path (line 493) | def _environ_path(name):
  class LibraryLoader (line 500) | class LibraryLoader:
    class Lookup (line 509) | class Lookup:
      method __init__ (line 514) | def __init__(self, path):
      method get (line 518) | def get(self, name, calling_convention="cdecl"):
      method has (line 528) | def has(self, name, calling_convention="cdecl"):
      method __getattr__ (line 534) | def __getattr__(self, name):
    method __init__ (line 537) | def __init__(self):
    method __call__ (line 540) | def __call__(self, libname):
    method getpaths (line 553) | def getpaths(self, libname):
    method getplatformpaths (line 591) | def getplatformpaths(self, _libname):  # pylint: disable=no-self-use
  class DarwinLibraryLoader (line 599) | class DarwinLibraryLoader(LibraryLoader):
    class Lookup (line 612) | class Lookup(LibraryLoader.Lookup):
    method getplatformpaths (line 623) | def getplatformpaths(self, libname):
    method getdirs (line 634) | def getdirs(libname):
  class PosixLibraryLoader (line 673) | class PosixLibraryLoader(LibraryLoader):
    class _Directories (line 682) | class _Directories(dict):
      method __init__ (line 685) | def __init__(self):
      method add (line 689) | def add(self, directory):
      method extend (line 700) | def extend(self, directories):
      method ordered (line 705) | def ordered(self):
    method _get_ld_so_conf_dirs (line 709) | def _get_ld_so_conf_dirs(self, conf, dirs):
    method _create_ld_so_cache (line 731) | def _create_ld_so_cache(self):
    method getplatformpaths (line 801) | def getplatformpaths(self, libname):
  class WindowsLibraryLoader (line 816) | class WindowsLibraryLoader(LibraryLoader):
    class Lookup (line 821) | class Lookup(LibraryLoader.Lookup):
      method __init__ (line 824) | def __init__(self, path):
  function add_library_search_dirs (line 844) | def add_library_search_dirs(other_dirs):

FILE: tests/ctypesgentest.py
  function redirect (line 29) | def redirect(stdout=sys.stdout):
  function generate (line 38) | def generate(header, **more_options):
  function cleanup (line 98) | def cleanup(filepattern="temp.*"):
  function set_logging_level (line 104) | def set_logging_level(log_level):
  function ctypesgen_version (line 108) | def ctypesgen_version():
  function sort_anon_fn (line 112) | def sort_anon_fn(anon_tag):
  class JsonHelper (line 116) | class JsonHelper:
    method __init__ (line 126) | def __init__(self):
    method prepare (line 129) | def prepare(self, json):
    method _replace_anon_tag (line 144) | def _replace_anon_tag(self, json, tag, new_tag):
    method _search_anon_tags (line 164) | def _search_anon_tags(self, json):
  function generate_common (line 187) | def generate_common():
  function cleanup_common (line 201) | def cleanup_common():
  function _compile_common (line 208) | def _compile_common(common_lib):
  function _generate_common (line 223) | def _generate_common(file_name, common_lib, embed_preamble=True):
  function _create_common_files (line 240) | def _create_common_files():
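
`redirect(stdout=sys.stdout)` in tests/ctypesgentest.py appears to be a stream-swapping helper for capturing output during tests. A hedged sketch of such a context manager, assuming that role (the actual helper may be implemented differently):

```python
import sys
from contextlib import contextmanager
from io import StringIO

@contextmanager
def redirect(stdout=None):
    # Temporarily replace sys.stdout so printed output can be captured;
    # restore the original stream on exit, even if the body raises.
    saved = sys.stdout
    sys.stdout = stdout if stdout is not None else StringIO()
    try:
        yield sys.stdout
    finally:
        sys.stdout = saved
```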

FILE: tests/testsuite.py
  function compare_json (line 48) | def compare_json(test_instance, json, json_ans, verbose=False):
  function compute_packed (line 91) | def compute_packed(modulo, fields):
  class StdlibTest (line 102) | class StdlibTest(unittest.TestCase):
    method setUpClass (line 104) | def setUpClass(cls):
    method tearDownClass (line 117) | def tearDownClass(cls):
    method test_getenv_returns_string (line 121) | def test_getenv_returns_string(self):
    method test_getenv_returns_null (line 148) | def test_getenv_returns_null(self):
  class CommonHeaderTest (line 176) | class CommonHeaderTest(unittest.TestCase):
    method setUpClass (line 178) | def setUpClass(cls):
    method tearDownClass (line 182) | def tearDownClass(cls):
    method test_two_import_with_embedded_preamble (line 186) | def test_two_import_with_embedded_preamble(self):
    method test_one_import (line 194) | def test_one_import(self):
    method test_two_import (line 200) | def test_two_import(self):
  class StdBoolTest (line 209) | class StdBoolTest(unittest.TestCase):
    method setUpClass (line 213) | def setUpClass(cls):
    method tearDownClass (line 226) | def tearDownClass(cls):
    method test_stdbool_type (line 230) | def test_stdbool_type(self):
  class IntTypesTest (line 237) | class IntTypesTest(unittest.TestCase):
    method setUpClass (line 241) | def setUpClass(cls):
    method tearDownClass (line 260) | def tearDownClass(cls):
    method test_int_types (line 264) | def test_int_types(self):
  class SimpleMacrosTest (line 286) | class SimpleMacrosTest(unittest.TestCase):
    method setUpClass (line 290) | def setUpClass(cls):
    method _json (line 308) | def _json(self, name):
    method tearDownClass (line 315) | def tearDownClass(cls):
    method test_macro_constant_int (line 319) | def test_macro_constant_int(self):
    method test_macro_addition_json (line 326) | def test_macro_addition_json(self):
    method test_macro_addition (line 334) | def test_macro_addition(self):
    method test_macro_ternary_json (line 340) | def test_macro_ternary_json(self):
    method test_macro_ternary_true (line 354) | def test_macro_ternary_true(self):
    method test_macro_ternary_false (line 360) | def test_macro_ternary_false(self):
    method test_macro_ternary_true_complex (line 366) | def test_macro_ternary_true_complex(self):
    method test_macro_ternary_false_complex (line 372) | def test_macro_ternary_false_complex(self):
    method test_macro_string_compose (line 378) | def test_macro_string_compose(self):
    method test_macro_string_compose_json (line 384) | def test_macro_string_compose_json(self):
    method test_macro_math_multipler (line 393) | def test_macro_math_multipler(self):
    method test_macro_math_multiplier_json (line 399) | def test_macro_math_multiplier_json(self):
    method test_macro_math_minus (line 412) | def test_macro_math_minus(self):
    method test_macro_math_minus_json (line 418) | def test_macro_math_minus_json(self):
    method test_macro_math_divide (line 431) | def test_macro_math_divide(self):
    method test_macro_math_divide_json (line 437) | def test_macro_math_divide_json(self):
    method test_macro_math_mod (line 450) | def test_macro_math_mod(self):
    method test_macro_math_mod_json (line 456) | def test_macro_math_mod_json(self):
    method test_macro_subcall_simple (line 464) | def test_macro_subcall_simple(self):
    method test_macro_subcall_simple_json (line 470) | def test_macro_subcall_simple_json(self):
    method test_macro_subcall_simple_plus (line 478) | def test_macro_subcall_simple_plus(self):
    method test_macro_subcall_simple_plus_json (line 484) | def test_macro_subcall_simple_plus_json(self):
    method test_macro_subcall_minus (line 497) | def test_macro_subcall_minus(self):
    method test_macro_subcall_minus_json (line 504) | def test_macro_subcall_minus_json(self):
    method test_macro_subcall_minus_plus (line 517) | def test_macro_subcall_minus_plus(self):
    method test_macro_subcall_minus_plus_json (line 524) | def test_macro_subcall_minus_plus_json(self):
  class StructuresTest (line 538) | class StructuresTest(unittest.TestCase):
    method setUpClass (line 542) | def setUpClass(cls):
    method tearDownClass (line 637) | def tearDownClass(cls):
    method test_struct_json (line 641) | def test_struct_json(self):
    method test_fields (line 1960) | def test_fields(self):
    method test_pack (line 1974) | def test_pack(self):
    method test_pragma_pack (line 1993) | def test_pragma_pack(self):
    method test_typedef_vs_field_id (line 2012) | def test_typedef_vs_field_id(self):
    method test_anonymous_tag_uniformity (line 2022) | def test_anonymous_tag_uniformity(self):
  class MathTest (line 2032) | class MathTest(unittest.TestCase):
    method setUpClass (line 2036) | def setUpClass(cls):
    method tearDownClass (line 2052) | def tearDownClass(cls):
    method test_sin (line 2056) | def test_sin(self):
    method test_sqrt (line 2062) | def test_sqrt(self):
    method test_bad_args_string_not_number (line 2073) | def test_bad_args_string_not_number(self):
    method test_subcall_sin (line 2082) | def test_subcall_sin(self):
  class EnumTest (line 2089) | class EnumTest(unittest.TestCase):
    method setUpClass (line 2091) | def setUpClass(cls):
    method tearDownClass (line 2102) | def tearDownClass(cls):
    method test_enum (line 2106) | def test_enum(self):
    method test_enum_json (line 2110) | def test_enum_json(self):
  class PrototypeTest (line 2199) | class PrototypeTest(unittest.TestCase):
    method setUpClass (line 2201) | def setUpClass(cls):
    method tearDownClass (line 2214) | def tearDownClass(cls):
    method test_function_prototypes_json (line 2218) | def test_function_prototypes_json(self):
  class LongDoubleTest (line 2352) | class LongDoubleTest(unittest.TestCase):
    method setUpClass (line 2356) | def setUpClass(cls):
    method tearDownClass (line 2367) | def tearDownClass(cls):
    method test_longdouble_type (line 2371) | def test_longdouble_type(self):
  class MainTest (line 2380) | class MainTest(unittest.TestCase):
    method _exec (line 2394) | def _exec(args):
    method test_version (line 2400) | def test_version(self):
    method test_help (line 2407) | def test_help(self):
    method test_invalid_option (line 2417) | def test_invalid_option(self):
  class UncheckedTest (line 2428) | class UncheckedTest(unittest.TestCase):
    method setUpClass (line 2432) | def setUpClass(cls):
    method test_unchecked_prototype (line 2438) | def test_unchecked_prototype(self):
    method tearDownClass (line 2447) | def tearDownClass(cls):
  class ConstantsTest (line 2452) | class ConstantsTest(unittest.TestCase):
    method setUpClass (line 2456) | def setUpClass(cls):
    method tearDownClass (line 2487) | def tearDownClass(cls):
    method test_integer_constants (line 2491) | def test_integer_constants(self):
    method test_floating_constants (line 2502) | def test_floating_constants(self):
    method test_struct_fields (line 2511) | def test_struct_fields(self):
    method test_character_constants (line 2525) | def test_character_constants(self):
  class NULLTest (line 2530) | class NULLTest(unittest.TestCase):
    method setUpClass (line 2534) | def setUpClass(cls):
    method tearDownClass (line 2539) | def tearDownClass(cls):
    method test_null_type (line 2543) | def test_null_type(self):
  class MacromanEncodeTest (line 2549) | class MacromanEncodeTest(unittest.TestCase):
    method setUpClass (line 2556) | def setUpClass(cls):
    method tearDownClass (line 2576) | def tearDownClass(cls):
    method test_macroman_encoding_source (line 2581) | def test_macroman_encoding_source(self):
  function main (line 2587) | def main(argv=None):

Condensed preview — 70 files, each showing path, character count, and a content snippet (full structured content is 759K chars).
[
  {
    "path": ".flake8",
    "chars": 626,
    "preview": "[flake8]\nignore =\n    # whitespace before ':' (Black)\n    E203,\n    # line break before binary operator (Black)\n    W503"
  },
  {
    "path": ".github/workflows/black.yml",
    "chars": 691,
    "preview": "---\nname: Python Black Formatting\n\non:\n  - push\n  - pull_request\n  - fork\n\njobs:\n  black:\n    name: Black\n    runs-on: u"
  },
  {
    "path": ".github/workflows/flake8.yml",
    "chars": 543,
    "preview": "---\nname: Python Flake8 Code Quality\n\non:\n  - push\n  - pull_request\n  - fork\n\njobs:\n  flake8:\n    name: ${{ matrix.direc"
  },
  {
    "path": ".github/workflows/publish.yml",
    "chars": 915,
    "preview": "---\nname: Publish Python distributions to PyPI\n\non:\n  release:\n    types: [published]\njobs:\n  build-n-publish:\n    name:"
  },
  {
    "path": ".github/workflows/test.yml",
    "chars": 1020,
    "preview": "---\nname: Test\n\non:\n  - push\n  - pull_request\n  - fork\n\njobs:\n  setup-and-test:\n    name: Python-${{ matrix.python }} ${"
  },
  {
    "path": ".gitignore",
    "chars": 238,
    "preview": "# precompiled python files\n*.pyc\n\n# generated by distutils\nMANIFEST\ndist/\n\n# generated by setuptools\nbuild/\nctypesgen.eg"
  },
  {
    "path": ".travis.yml",
    "chars": 943,
    "preview": "dist: bionic\nlanguage: python\npython: 3.7.8\n\ninstall:\n  - pip install tox\nscript:\n  - tox\n\nstages:\n  - name: tox\n  - nam"
  },
  {
    "path": "CHANGELOG.md",
    "chars": 2072,
    "preview": "## Change Log\n\n### Unreleased\n\n### v1.1.1\n\n- Fixed inconsistency in version output in released packages\n\n### v1.1.0\n\nThi"
  },
  {
    "path": "CONTRIBUTING",
    "chars": 589,
    "preview": "The best way to document a bug is to create a new test which demonstrates it. You should do that by adding a new test to"
  },
  {
    "path": "LICENSE",
    "chars": 1312,
    "preview": "Copyright (c) 2007-2022, Ctypesgen Developers\nAll rights reserved.\n\nRedistribution and use in source and binary forms, w"
  },
  {
    "path": "MANIFEST.in",
    "chars": 106,
    "preview": "graft ctypesgen\nrecursive-exclude ctypesgen .gitignore\nglobal-exclude *.py[cod]\ninclude ctypesgen/VERSION\n"
  },
  {
    "path": "README.md",
    "chars": 2365,
    "preview": "                              ctypesgen\n                              ---------\n\n                  (c) Ctypesgen develop"
  },
  {
    "path": "ctypesgen/.gitignore",
    "chars": 8,
    "preview": "VERSION\n"
  },
  {
    "path": "ctypesgen/__init__.py",
    "chars": 2149,
    "preview": "\"\"\"\nCtypesgencore is the module that contains the main body of ctypesgen - in fact,\nit contains everything but the comma"
  },
  {
    "path": "ctypesgen/__main__.py",
    "chars": 10846,
    "preview": "\"\"\"\nCommand-line interface for ctypesgen\n\"\"\"\n\nimport argparse\n\nfrom ctypesgen import (\n    messages as msgs,\n    options"
  },
  {
    "path": "ctypesgen/ctypedescs.py",
    "chars": 11685,
    "preview": "\"\"\"\nctypesgen.ctypedescs contains classes to represent a C type. All of them\nclasses are subclasses of CtypesType.\n\nUnli"
  },
  {
    "path": "ctypesgen/descriptions.py",
    "chars": 7813,
    "preview": "\"\"\"\nctypesgen.descriptions contains classes to represent a description of a\nstruct, union, enum, function, constant, var"
  },
  {
    "path": "ctypesgen/expressions.py",
    "chars": 11117,
    "preview": "\"\"\"\nThe expressions module contains classes to represent an expression. The main\nclass is ExpressionNode. ExpressionNode"
  },
  {
    "path": "ctypesgen/libraryloader.py",
    "chars": 14059,
    "preview": "\"\"\"\nLoad libraries - appropriately for all our supported platforms\n\"\"\"\n# -----------------------------------------------"
  },
  {
    "path": "ctypesgen/messages.py",
    "chars": 1351,
    "preview": "\"\"\"\nctypesgen.messages contains functions to display status, error, or warning\nmessages to the user. Warning and error m"
  },
  {
    "path": "ctypesgen/options.py",
    "chars": 1353,
    "preview": "\"\"\"\nAll of the components of ctypegencore require an argument called \"options\".\nIn command-line usage, this would be an "
  },
  {
    "path": "ctypesgen/parser/.gitignore",
    "chars": 27,
    "preview": "new_parsetab.py\nparser.out\n"
  },
  {
    "path": "ctypesgen/parser/__init__.py",
    "chars": 704,
    "preview": "\"\"\"\nThis package parses C header files and generates lists of functions, typedefs,\nvariables, structs, unions, enums, ma"
  },
  {
    "path": "ctypesgen/parser/cdeclarations.py",
    "chars": 7975,
    "preview": "\"\"\"\nThis file contains classes that represent C declarations. cparser produces\ndeclarations in this format, and ctypespa"
  },
  {
    "path": "ctypesgen/parser/cgrammar.py",
    "chars": 45467,
    "preview": "#!/usr/bin/env python3\n\n\"\"\"This is a yacc grammar for C.\n\nDerived from ANSI C grammar:\n  * Lexicon: http://www.lysator.l"
  },
  {
    "path": "ctypesgen/parser/cparser.py",
    "chars": 7386,
    "preview": "\"\"\"\nParse a C source file.\n\nTo use, subclass CParser and override its handle_* methods.  Then instantiate\nthe class with"
  },
  {
    "path": "ctypesgen/parser/ctypesparser.py",
    "chars": 8668,
    "preview": "\"\"\"\nctypesgen.parser.ctypesparser contains a class, CtypesParser, which is a\nsubclass of ctypesgen.parser.cparser.CParse"
  },
  {
    "path": "ctypesgen/parser/datacollectingparser.py",
    "chars": 12675,
    "preview": "\"\"\"\nDataCollectingParser subclasses ctypesparser.CtypesParser and builds Description\nobjects from the CtypesType objects"
  },
  {
    "path": "ctypesgen/parser/lex.py",
    "chars": 42938,
    "preview": "# -----------------------------------------------------------------------------\n# ply: lex.py\n#\n# Copyright (C) 2001-201"
  },
  {
    "path": "ctypesgen/parser/lextab.py",
    "chars": 13550,
    "preview": "# lextab.py. This file automatically created by PLY (version 3.11). Don't edit!\n_tabversion   = '3.10'\n_lextokens    = s"
  },
  {
    "path": "ctypesgen/parser/parsetab.py",
    "chars": 162925,
    "preview": "\n# new_parsetab.py\n# This file is automatically generated. Do not edit.\n# pylint: disable=W,C,R\n_tabversion = '3.10'\n\n_l"
  },
  {
    "path": "ctypesgen/parser/pplexer.py",
    "chars": 8923,
    "preview": "\"\"\"Preprocess a C source file using gcc and convert the result into\n   a token stream\n\nReference is C99 with additions f"
  },
  {
    "path": "ctypesgen/parser/preprocessor.py",
    "chars": 6233,
    "preview": "\"\"\"Preprocess a C source file using gcc and convert the result into\n   a token stream\n\nReference is C99:\n  * http://www."
  },
  {
    "path": "ctypesgen/parser/yacc.py",
    "chars": 138074,
    "preview": "# -----------------------------------------------------------------------------\n# ply: yacc.py\n#\n# Copyright (C) 2001-20"
  },
  {
    "path": "ctypesgen/printer_json/__init__.py",
    "chars": 175,
    "preview": "\"\"\"\nThis module is the backend to ctypesgen; it contains classes to\nproduce the final .py output files.\n\"\"\"\n\nfrom .print"
  },
  {
    "path": "ctypesgen/printer_json/printer.py",
    "chars": 4925,
    "preview": "import os\nimport sys\nimport json\n\nfrom ctypesgen.ctypedescs import CtypesBitfield\nfrom ctypesgen.messages import status_"
  },
  {
    "path": "ctypesgen/printer_python/__init__.py",
    "chars": 175,
    "preview": "\"\"\"\nThis module is the backend to ctypesgen; it contains classes to\nproduce the final .py output files.\n\"\"\"\n\nfrom .print"
  },
  {
    "path": "ctypesgen/printer_python/defaultheader.py",
    "chars": 117,
    "preview": "r\"\"\"Wrapper for %(name)s\n\nGenerated with:\n%(argv)s\n\nDo not modify this file.\n\"\"\"\n\n__docformat__ = \"restructuredtext\"\n"
  },
  {
    "path": "ctypesgen/printer_python/preamble.py",
    "chars": 12544,
    "preview": "import ctypes\nimport sys\nfrom ctypes import *  # noqa: F401, F403\n\n_int_types = (ctypes.c_int16, ctypes.c_int32)\nif hasa"
  },
  {
    "path": "ctypesgen/printer_python/printer.py",
    "chars": 17785,
    "preview": "import os\nimport os.path\nimport sys\nimport time\nimport shutil\n\nfrom ctypesgen.ctypedescs import CtypesBitfield, CtypesSt"
  },
  {
    "path": "ctypesgen/processor/__init__.py",
    "chars": 249,
    "preview": "\"\"\"\nThis module contains functions to operate on the DeclarationCollection produced\nby the parser module and prepare it "
  },
  {
    "path": "ctypesgen/processor/dependencies.py",
    "chars": 6015,
    "preview": "\"\"\"\nThe dependencies module determines which descriptions depend on which other\ndescriptions.\n\"\"\"\n\nfrom ctypesgen.descri"
  },
  {
    "path": "ctypesgen/processor/operations.py",
    "chars": 9657,
    "preview": "\"\"\"\nThe operations module contains various functions to process the\nDescriptionCollection and prepare it for output.\ncty"
  },
  {
    "path": "ctypesgen/processor/pipeline.py",
    "chars": 5895,
    "preview": "\"\"\"\nA brief explanation of the processing steps:\n1. The dependencies module builds a dependency graph for the descriptio"
  },
  {
    "path": "ctypesgen/version.py",
    "chars": 2052,
    "preview": "#!/usr/bin/env python3\n\nfrom subprocess import Popen, PIPE\nimport os\nfrom os import path\n\nTHIS_DIR = path.dirname(__file"
  },
  {
    "path": "debian/.gitignore",
    "chars": 134,
    "preview": "changelog\ncopyright\nctypesgen/\n*.log\n*.substvars\nfiles\ndebhelper-build-stamp\n*.postinst.debhelper\n*.prerm.debhelper\n*.sw"
  },
  {
    "path": "debian/compat",
    "chars": 2,
    "preview": "9\n"
  },
  {
    "path": "debian/control",
    "chars": 1083,
    "preview": "Source: ctypesgen\nSection: universe/python\nPriority: optional\nBuild-Depends: debhelper (>= 9), dh-python,\n python3-all ("
  },
  {
    "path": "debian/ctypesgen.docs",
    "chars": 6,
    "preview": "demo/\n"
  },
  {
    "path": "debian/ctypesgen.manpages",
    "chars": 19,
    "preview": "debian/ctypesgen.1\n"
  },
  {
    "path": "debian/mk_changelog",
    "chars": 3832,
    "preview": "#!/usr/bin/env python3\n# vim: ts=2:sw=2:et:tw=80:nowrap\n\nfrom subprocess import Popen, PIPE\nimport re, io\nfrom datetime "
  },
  {
    "path": "debian/mk_manpage",
    "chars": 1512,
    "preview": "#!/usr/bin/env python3\n\"\"\"\nThis script generates a manual page.\n\"\"\"\n\nfrom os import path, system, unlink, mkdir, rmdir\ni"
  },
  {
    "path": "debian/rules",
    "chars": 1266,
    "preview": "#!/usr/bin/make -f\n# debian/rules\n# -*- makefile -*-\n\nexport DH_VERBOSE=1\nDH_VERBOSE = 1\n\n# see EXAMPLES in dpkg-buildfl"
  },
  {
    "path": "demo/.gitignore",
    "chars": 29,
    "preview": "demoapp\ndemolib.o\ndemolib.so\n"
  },
  {
    "path": "demo/README.md",
    "chars": 1202,
    "preview": "Small Demonstration of Ctypesgen\n================================\n\nThis little demonstration was originally written by d"
  },
  {
    "path": "demo/demoapp.c",
    "chars": 513,
    "preview": "/*\n** Trivial ctypesgen demo library consumer\n**  from http://code.google.com/p/ctypesgen\n**\n** This demoapp it self is "
  },
  {
    "path": "demo/demoapp.py",
    "chars": 624,
    "preview": "#!/usr/bin/env python3\n\"\"\"\nTrivial ctypesgen demo library consumer\nfrom http://code.google.com/p/ctypesgen\n\n NOTE demoli"
  },
  {
    "path": "demo/demolib.c",
    "chars": 301,
    "preview": "/*\n** Trivial ctypesgen demo library\n**  from http://code.google.com/p/ctypesgen\n\nDumb manual build with:\n\n\n    gcc -fPI"
  },
  {
    "path": "demo/demolib.h",
    "chars": 116,
    "preview": "/*\n** Trivial ctypesgen demo library\n**  from http://code.google.com/p/ctypesgen\n*/\n\nint trivial_add(int a, int b);\n"
  },
  {
    "path": "demo/pydemolib.py",
    "chars": 27283,
    "preview": "r\"\"\"Wrapper for demolib.h\n\nGenerated with:\n../run.py -o pydemolib.py -l demolib.so demolib.h\n\nDo not modify this file.\n\""
  },
  {
    "path": "docs/publishing.md",
    "chars": 2561,
    "preview": "# How to Publish a New Release\n\n## Versioning\n\nVersioning within ctypesgen follows these general rules:\n\n* Versions are "
  },
  {
    "path": "pyproject.toml",
    "chars": 1762,
    "preview": "[build-system]\nrequires = [\"setuptools>=64\", \"setuptools_scm>=7.1\"]\nbuild-backend = \"setuptools.build_meta\"\n\n[project]\nn"
  },
  {
    "path": "run.py",
    "chars": 250,
    "preview": "#!/usr/bin/env python3\n\nimport sys\nimport os\n\n# ensure that we can load the ctypesgen library\nTHIS_DIR = os.path.dirname"
  },
  {
    "path": "setup.py",
    "chars": 93,
    "preview": "#!/usr/bin/env python3\n\nfrom setuptools import setup\n\nif __name__ == \"__main__\":\n    setup()\n"
  },
  {
    "path": "tests/.gitignore",
    "chars": 23,
    "preview": "temp.h\ntemp.py\ncommon/\n"
  },
  {
    "path": "tests/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "tests/ctypesgentest.py",
    "chars": 8124,
    "preview": "\"\"\"ctypesgentest is a simple module for testing ctypesgen on various C constructs.\n\nIt consists of a single function, te"
  },
  {
    "path": "tests/testsuite.py",
    "chars": 90411,
    "preview": "#!/usr/bin/env python3\n\"\"\"Simple test suite using unittest.\nBy clach04 (Chris Clark).\n\nCalling:\n\n    python3 -m unittest"
  },
  {
    "path": "todo.txt",
    "chars": 136,
    "preview": "1. Convert defines from \"errno.h\" into imports from the Python errno module.\n2. Search through code for \"XXX\" and see wh"
  },
  {
    "path": "tox.ini",
    "chars": 716,
    "preview": "[tox]\nenvlist = py37, py38, py39\nskip_missing_interpreters = true\n\n[testenv]\ndeps =\n    pytest\ncommands =\n    pytest -v "
  }
]
