master ea4c4d66d316 cached
50 files
144.7 KB
36.5k tokens
236 symbols
1 requests
Download .txt
Repository: xmlrunner/unittest-xml-reporting
Branch: master
Commit: ea4c4d66d316
Files: 50
Total size: 144.7 KB

Directory structure:
gitextract_u1gfj0s8/

├── .coveragerc
├── .github/
│   ├── ISSUE_TEMPLATE/
│   │   ├── bug_report.md
│   │   └── feature_request.md
│   ├── pull_request_template.md
│   └── workflows/
│       └── tests.yml
├── .gitignore
├── .landscape.yml
├── LICENSE
├── MANIFEST.in
├── Makefile
├── README.md
├── docs/
│   ├── Makefile
│   ├── conf.py
│   └── index.rst
├── pyproject.toml
├── tests/
│   ├── __init__.py
│   ├── builder_test.py
│   ├── discovery_test.py
│   ├── django_example/
│   │   ├── app/
│   │   │   ├── __init__.py
│   │   │   ├── admin.py
│   │   │   ├── migrations/
│   │   │   │   └── __init__.py
│   │   │   ├── models.py
│   │   │   ├── tests.py
│   │   │   └── views.py
│   │   ├── app2/
│   │   │   ├── __init__.py
│   │   │   ├── admin.py
│   │   │   ├── migrations/
│   │   │   │   └── __init__.py
│   │   │   ├── models.py
│   │   │   ├── tests.py
│   │   │   └── views.py
│   │   ├── example/
│   │   │   ├── __init__.py
│   │   │   ├── settings.py
│   │   │   ├── urls.py
│   │   │   └── wsgi.py
│   │   └── manage.py
│   ├── django_test.py
│   ├── doctest_example.py
│   ├── testsuite.py
│   └── vendor/
│       └── jenkins/
│           └── xunit-plugin/
│               ├── 14c6e39c38408b9ed6280361484a13c6f5becca7/
│               │   └── junit-10.xsd
│               └── ae25da5089d4f94ac6c4669bf736e4d416cc4665/
│                   └── junit-10.xsd
├── tox.ini
└── xmlrunner/
    ├── __init__.py
    ├── __main__.py
    ├── builder.py
    ├── extra/
    │   ├── __init__.py
    │   ├── djangotestrunner.py
    │   └── xunit_plugin.py
    ├── result.py
    ├── runner.py
    └── unittest.py

================================================
FILE CONTENTS
================================================

================================================
FILE: .coveragerc
================================================
[report]
include =
  tests/*
  xmlrunner/*


================================================
FILE: .github/ISSUE_TEMPLATE/bug_report.md
================================================
<!--
Thank you for your contribution and taking the time to report the issue!
Note: Please search to see if an issue already exists for the bug you encountered.
-->

---
name: 🐞 Bug
about: File a bug/issue
title: '[BUG] <title>'
labels: Bug, Needs Triage
assignees: ''

---


### Current Behavior:
<!-- A concise description of what you're experiencing. -->

### Expected Behavior:
<!-- A concise description of what you expected to happen. -->

### Steps To Reproduce:
<!--
Example: steps to reproduce the behavior:
1. In this environment...
1. With this config...
1. Run '...'
1. See error...

or a minimal example
-->

### Environment:
<!--
Example:
- OS: Ubuntu 20.04
- Python: 3.14.0
- xmlrunner: 4.0.0
output of `pip list`
-->

### Anything else:
<!--
Links? References? Anything that will give us more context about the issue that you are encountering!
-->


================================================
FILE: .github/ISSUE_TEMPLATE/feature_request.md
================================================
<!--
Thank you for your contribution and taking the time to report the issue!
Note: Please search to see if an issue already exists for the bug you encountered.
-->

---
name: 🚀 Feature Request
about: File a feature request
title: '[Feature] <title>'
labels: Feature, Needs Triage
assignees: ''

---


### Current Behavior:
<!--
Explain why the current behavior has shortcomings.
-->

### Feature Behavior:
<!--
A detailed description of the feature.
Explain the pros/cons of this approach.
Explain the target audience for this feature,
e.g. is it for a very niche use case? or will everybody use it?
-->

### Test Plan for the feature
<!--
How to test the new feature and ensure it plays nice with other existing features.
How to verify that it works?
Ideas for unit tests?
-->

### Alternative solutions:
<!--
Explain how you are currently achieving the results a different way.
Explain the pros/cons of the alternative solutions.
-->

### Environment:
<!--
Example:
- OS: Ubuntu 20.04
- Python: 3.14.0
- xmlrunner: 4.0.0
output of `pip list`
-->

### Anything else:
<!--
Links? References?
Anything that will give us more context about the feature request!
-->


================================================
FILE: .github/pull_request_template.md
================================================
<!--
Thank you for submitting a PR and your contribution!

Quick checklist

- [ ] Include unit tests when applicable
- [ ] Documentation and comments
- [ ] Reference existing issues, e.g. `see issue #123`, `closes #123`
- [ ] Reference existing pull requests, e.g. `supersedes PR #123`

See the [github docs](https://help.github.com/en/github/managing-your-work-on-github/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword) for more information.

-->

================================================
FILE: .github/workflows/tests.yml
================================================
name: Tests

on:
  push:
    branches:
      - master
  pull_request:

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        name: [
          "py310",
          "py311",
          "py312",
          "py313",
          "py314",
        ]
        include:
          - name: py310
            python-version: "3.10"
            tox_env: py310
          - name: py311
            python-version: "3.11"
            tox_env: py311
          - name: py312
            python-version: "3.12"
            tox_env: py312
          - name: py313
            python-version: "3.13"
            tox_env: py313
          - name: py314
            python-version: "3.14"
            tox_env: py314
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Setup Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Before Install
        run: |
          python --version
          uname -a
          lsb_release -a
      - name: Install
        env:
          TOXENV: ${{ matrix.tox_env }}
        run: |
          pip install tox-gh-actions codecov coveralls
          pip --version
          tox --version
      - name: Script
        run: |
          tox run -v
      - name: After Failure
        if: ${{ failure() }}
        run: |
          more .tox/log/* | cat
          more .tox/*/log/* | cat
      - name: Upload to Codecov
        run: |
          tox -e uploadcodecov
        env:
          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}


================================================
FILE: .gitignore
================================================
# Python bytecode
*.pyc

# Build directory
build/*
dist/*

# Egg info directory
*.egg-info

# tox + coverage
.tox
.coverage
htmlcov/

# autogenerated
xmlrunner/version.py


================================================
FILE: .landscape.yml
================================================
doc-warnings: true
test-warnings: false
strictness: veryhigh
max-line-length: 80
autodetect: true
python-targets:
  - 2
  - 3


================================================
FILE: LICENSE
================================================
Copyright (c) 2008-2013, Daniel Fernandes Martins
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met: 

1. Redistributions of source code must retain the above copyright notice, this
   list of conditions and the following disclaimer. 
2. Redistributions in binary form must reproduce the above copyright notice,
   this list of conditions and the following disclaimer in the documentation
   and/or other materials provided with the distribution. 

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

The views and conclusions contained in the software and documentation are those
of the authors and should not be interpreted as representing official policies, 
either expressed or implied, of the FreeBSD Project.

================================================
FILE: MANIFEST.in
================================================
include README.md
include LICENSE

================================================
FILE: Makefile
================================================

build/tox/bin:
	python3 -m venv build/tox
	build/tox/bin/pip install tox

build/publish/bin:
	python3 -m venv build/publish
	build/publish/bin/pip install build wheel twine

checkversion:
	git log -1 --oneline | grep -q "Bump version" || (echo "DID NOT DO VERSION BUMP"; exit 1)
	git show-ref --tags | grep -q $$(git log -1 --pretty=%H) || (echo "DID NOT TAG VERSION"; exit 1)

dist: checkversion build/publish/bin
	build/publish/bin/python -m build

publish: dist/ build/publish/bin
	build/publish/bin/twine upload dist/*

test: build/tox/bin
	build/tox/bin/tox

clean:
	rm -rf build/ dist/

.PHONY: checkversion dist publish clean test


================================================
FILE: README.md
================================================
[![License](https://img.shields.io/pypi/l/unittest-xml-reporting.svg)](https://pypi.python.org/pypi/unittest-xml-reporting/)
[![Latest Version](https://img.shields.io/pypi/v/unittest-xml-reporting.svg)](https://pypi.python.org/pypi/unittest-xml-reporting/)
[![Development Status](https://img.shields.io/pypi/status/unittest-xml-reporting.svg)](https://pypi.python.org/pypi/unittest-xml-reporting/)
[![Documentation Status](https://readthedocs.org/projects/unittest-xml-reporting/badge/?version=latest)](http://unittest-xml-reporting.readthedocs.io/en/latest/?badge=latest)

[![codecov.io Coverage Status](https://codecov.io/github/xmlrunner/unittest-xml-reporting/coverage.svg?branch=master)](https://codecov.io/github/xmlrunner/unittest-xml-reporting?branch=master)
[![Coveralls Coverage Status](https://coveralls.io/repos/xmlrunner/unittest-xml-reporting/badge.svg?branch=master&service=github)](https://coveralls.io/github/xmlrunner/unittest-xml-reporting?branch=master)


# unittest-xml-reporting (aka xmlrunner)

A unittest test runner that can save test results to XML files in xUnit format.
The files can be consumed by a wide range of tools, such as build systems, IDEs
and continuous integration servers.


## PyPI

- PyPI page for the project: https://pypi.org/project/unittest-xml-reporting/
- PyPI download stats: https://pypistats.org/packages/unittest-xml-reporting


## Contributing

We are always looking for good contributions, so please just fork the
repository and send pull requests (with tests please!).

If you would like write access to the repository, or become a maintainer,
feel free to get in touch.


## Requirements

See [Status of Python versions](https://devguide.python.org/versions/) for python EOL information.

* Python 3.10+
* Last version supporting Python 3.7 - 3.9 was 3.2.0
* Please note Python 3.6 end-of-life was in Dec 2021, last version supporting 3.6 was 3.1.0
* Please note Python 3.5 end-of-life was in Sep 2020, last version supporting 3.5 was 3.1.0
* Please note Python 2.7 end-of-life was in Jan 2020, last version supporting 2.7 was 2.5.2
* Please note Python 3.4 end-of-life was in Mar 2019, last version supporting 3.4 was 2.5.2
* Please note Python 2.6 end-of-life was in Oct 2013, last version supporting 2.6 was 1.14.0


## Limited support for `unittest.TestCase.subTest`

https://docs.python.org/3/library/unittest.html#unittest.TestCase.subTest

`unittest` has the concept of sub-tests for a `unittest.TestCase`; this doesn't map well to an existing xUnit concept, so you won't find it in the schema. What that means, is that you lose some granularity
in the reports for sub-tests.

`unittest` also does not report successful sub-tests, so the accounting won't be exact.

## Jenkins plugins

- Jenkins JUnit plugin : https://plugins.jenkins.io/junit/
- Jenkins xUnit plugin : https://plugins.jenkins.io/xunit/

### Jenkins JUnit plugin

This plugin does not perform XSD validation (at time of writing) and should parse the XML file without issues.

### Jenkins xUnit plugin version 1.100

- [Jenkins (junit-10.xsd), xunit plugin (2014-2018)](https://github.com/jenkinsci/xunit-plugin/blob/14c6e39c38408b9ed6280361484a13c6f5becca7/src/main/resources/org/jenkinsci/plugins/xunit/types/model/xsd/junit-10.xsd), version `1.100`.

This plugin does perfom XSD validation and uses the more lax XSD. This should parse the XML file without issues.

### Jenkins xUnit plugin version 1.104+

- [Jenkins (junit-10.xsd), xunit plugin (2018-current)](https://github.com/jenkinsci/xunit-plugin/blob/ae25da5089d4f94ac6c4669bf736e4d416cc4665/src/main/resources/org/jenkinsci/plugins/xunit/types/model/xsd/junit-10.xsd), version `1.104`+.

This plugin does perfom XSD validation and uses the more strict XSD.

See https://github.com/xmlrunner/unittest-xml-reporting/issues/209

```
import io
import unittest
import xmlrunner

# run the tests storing results in memory
out = io.BytesIO()
unittest.main(
    testRunner=xmlrunner.XMLTestRunner(output=out),
    failfast=False, buffer=False, catchbreak=False, exit=False)
```

Transform the results removing extra attributes.
```
from xmlrunner.extra.xunit_plugin import transform

with open('TEST-report.xml', 'wb') as report:
    report.write(transform(out.getvalue()))

```

## JUnit Schema ?

There are many tools claiming to write JUnit reports, so you will find many schemas with minor differences.

We used the XSD that was available in the Jenkins xUnit plugin version `1.100`; a copy is available under `tests/vendor/jenkins/xunit-plugin/.../junit-10.xsd` (see attached license).

You may also find these resources useful:

- https://stackoverflow.com/questions/4922867/what-is-the-junit-xml-format-specification-that-hudson-supports
- https://stackoverflow.com/questions/11241781/python-unittests-in-jenkins
- [JUnit-Schema (JUnit.xsd)](https://github.com/windyroad/JUnit-Schema/blob/master/JUnit.xsd)
- [Windyroad (JUnit.xsd)](http://windyroad.com.au/dl/Open%20Source/JUnit.xsd)
- [a gist (Jenkins xUnit test result schema)](https://gist.github.com/erikd/4192748)


## Installation

The easiest way to install unittest-xml-reporting is via
[Pip](http://www.pip-installer.org):

````bash
$ pip install unittest-xml-reporting
````

If you use Git and want to get the latest *development* version:

````bash
$ git clone https://github.com/xmlrunner/unittest-xml-reporting.git
$ cd unittest-xml-reporting
$ sudo python -m pip install .
````

Or get the latest *development* version as a tarball:

````bash
$ wget https://github.com/xmlrunner/unittest-xml-reporting/archive/master.zip
$ unzip master.zip
$ cd unittest-xml-reporting
$ sudo python -m pip install .
````

Or you can manually download the latest released version from
[PyPI](https://pypi.python.org/pypi/unittest-xml-reporting/).


## Command-line

````bash
python -m xmlrunner [options]
python -m xmlrunner discover [options]

# help
python -m xmlrunner -h
````

e.g. 
````bash
python -m xmlrunner discover -t ~/mycode/tests -o /tmp/build/junit-reports
````

## Usage

The script below, adapted from the
[unittest](http://docs.python.org/library/unittest.html), shows how to use
`XMLTestRunner` in a very simple way. In fact, the only difference between
this script and the original one is the last line:

````python
import random
import unittest
import xmlrunner

class TestSequenceFunctions(unittest.TestCase):

    def setUp(self):
        self.seq = list(range(10))

    @unittest.skip("demonstrating skipping")
    def test_skipped(self):
        self.fail("shouldn't happen")

    def test_shuffle(self):
        # make sure the shuffled sequence does not lose any elements
        random.shuffle(self.seq)
        self.seq.sort()
        self.assertEqual(self.seq, list(range(10)))

        # should raise an exception for an immutable sequence
        self.assertRaises(TypeError, random.shuffle, (1,2,3))

    def test_choice(self):
        element = random.choice(self.seq)
        self.assertTrue(element in self.seq)

    def test_sample(self):
        with self.assertRaises(ValueError):
            random.sample(self.seq, 20)
        for element in random.sample(self.seq, 5):
            self.assertTrue(element in self.seq)

if __name__ == '__main__':
    unittest.main(
        testRunner=xmlrunner.XMLTestRunner(output='test-reports'),
        # these make sure that some options that are not applicable
        # remain hidden from the help menu.
        failfast=False, buffer=False, catchbreak=False)
````

### Reporting to a single file

````python
if __name__ == '__main__':
    with open('/path/to/results.xml', 'wb') as output:
        unittest.main(
            testRunner=xmlrunner.XMLTestRunner(output=output),
            failfast=False, buffer=False, catchbreak=False)
````

### Doctest support

The XMLTestRunner can also be used to report on docstrings style tests.

````python
import doctest
import xmlrunner

def twice(n):
    """
    >>> twice(5)
    10
    """
    return 2 * n

class Multiplicator(object):
    def threetimes(self, n):
        """
        >>> Multiplicator().threetimes(5)
        15
        """
        return 3 * n

if __name__ == "__main__":
    suite = doctest.DocTestSuite()
    xmlrunner.XMLTestRunner().run(suite)
````

### Django support

In order to plug `XMLTestRunner` to a Django project, add the following
to your `settings.py`:

````python
TEST_RUNNER = 'xmlrunner.extra.djangotestrunner.XMLTestRunner'
````

Also, the following settings are provided so you can fine tune the reports:

|setting|default|values|description|
|-|-|-|-|
|`TEST_OUTPUT_VERBOSE`|`1`|`0\|1\|2`|Besides the XML reports generated by the test runner, a bunch of useful information is printed to the `sys.stderr` stream, just like the `TextTestRunner` does. Use this setting to choose between a verbose and a non-verbose output.|
|`TEST_OUTPUT_DESCRIPTIONS`|`False`|`True\|False`|If your test methods contains docstrings, you can display such docstrings instead of display the test name (ex: `module.TestCase.test_method`).<br>In order to use this feature, you have to enable verbose output by setting `TEST_OUTPUT_VERBOSE = 2`.<br>Only effects stdout and not XML output.|
|`TEST_OUTPUT_DIR`|`"."`|`<str>`|Tells the test runner where to put the XML reports. If the directory couldn't be found, the test runner will try to create it before generate the XML files.|
|`TEST_OUTPUT_FILE_NAME`|`None`|`<str>`|Tells the test runner to output a single XML report with this filename under `os.path.join(TEST_OUTPUT_DIR, TEST_OUTPUT_FILE_NAME)`.<br>Please note that for long running tests, this will keep the results in memory for a longer time than multiple reports, and may use up more resources.|


### Testing changes with `tox`

Please use `tox` to test your changes before sending a pull request.
You can find more information about `tox` at <https://testrun.org/tox/latest/>.

```bash
$ pip install tox

# basic sanity test, friendly output
$ tox -e pytest

# all combinations
$ tox
```


================================================
FILE: docs/Makefile
================================================
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
PAPER         =
BUILDDIR      = _build

# Internal variables.
PAPEROPT_a4     = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .

.PHONY: help
help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html       to make standalone HTML files"
	@echo "  dirhtml    to make HTML files named index.html in directories"
	@echo "  singlehtml to make a single large HTML file"
	@echo "  pickle     to make pickle files"
	@echo "  json       to make JSON files"
	@echo "  htmlhelp   to make HTML files and a HTML help project"
	@echo "  qthelp     to make HTML files and a qthelp project"
	@echo "  applehelp  to make an Apple Help Book"
	@echo "  devhelp    to make HTML files and a Devhelp project"
	@echo "  epub       to make an epub"
	@echo "  epub3      to make an epub3"
	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
	@echo "  latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
	@echo "  text       to make text files"
	@echo "  man        to make manual pages"
	@echo "  texinfo    to make Texinfo files"
	@echo "  info       to make Texinfo files and run them through makeinfo"
	@echo "  gettext    to make PO message catalogs"
	@echo "  changes    to make an overview of all changed/added/deprecated items"
	@echo "  xml        to make Docutils-native XML files"
	@echo "  pseudoxml  to make pseudoxml-XML files for display purposes"
	@echo "  linkcheck  to check all external links for integrity"
	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"
	@echo "  coverage   to run coverage check of the documentation (if enabled)"
	@echo "  dummy      to check syntax errors of document sources"

.PHONY: clean
clean:
	rm -rf $(BUILDDIR)/*

.PHONY: html
html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

.PHONY: dirhtml
dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

.PHONY: singlehtml
singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

.PHONY: pickle
pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

.PHONY: json
json:
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@echo
	@echo "Build finished; now you can process the JSON files."

.PHONY: htmlhelp
htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in $(BUILDDIR)/htmlhelp."

.PHONY: qthelp
qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/unittest-xml-reporting.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/unittest-xml-reporting.qhc"

.PHONY: applehelp
applehelp:
	$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
	@echo
	@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
	@echo "N.B. You won't be able to view it unless you put it in" \
	      "~/Library/Documentation/Help or install it in your application" \
	      "bundle."

.PHONY: devhelp
devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/unittest-xml-reporting"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/unittest-xml-reporting"
	@echo "# devhelp"

.PHONY: epub
epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

.PHONY: epub3
epub3:
	$(SPHINXBUILD) -b epub3 $(ALLSPHINXOPTS) $(BUILDDIR)/epub3
	@echo
	@echo "Build finished. The epub3 file is in $(BUILDDIR)/epub3."

.PHONY: latex
latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

.PHONY: latexpdf
latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

.PHONY: latexpdfja
latexpdfja:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through platex and dvipdfmx..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

.PHONY: text
text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

.PHONY: man
man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

.PHONY: texinfo
texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	      "(use \`make info' here to do that automatically)."

.PHONY: info
info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	make -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

.PHONY: gettext
gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

.PHONY: changes
changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

.PHONY: linkcheck
linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

.PHONY: doctest
doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the " \
	      "results in $(BUILDDIR)/doctest/output.txt."

.PHONY: coverage
coverage:
	$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
	@echo "Testing of coverage in the sources finished, look at the " \
	      "results in $(BUILDDIR)/coverage/python.txt."

.PHONY: xml
xml:
	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
	@echo
	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."

.PHONY: pseudoxml
pseudoxml:
	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
	@echo
	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

.PHONY: dummy
dummy:
	$(SPHINXBUILD) -b dummy $(ALLSPHINXOPTS) $(BUILDDIR)/dummy
	@echo
	@echo "Build finished. Dummy builder generates no files."


================================================
FILE: docs/conf.py
================================================
# -*- coding: utf-8 -*-
#
# unittest-xml-reporting documentation build configuration file, created by
# sphinx-quickstart on Mon May 30 11:39:40 2016.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))

# -- General configuration ------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#
# needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    'sphinx.ext.autodoc',
    'sphinx.ext.doctest',
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = ['.rst', '.md']

# The encoding of source files.
#
# source_encoding = 'utf-8-sig'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = u'unittest-xml-reporting'
copyright = u'2016, Daniel Fernandes Martins, Damien Nozay'
author = u'Daniel Fernandes Martins, Damien Nozay'

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = u'2.1.0'
# The full version, including alpha/beta/rc tags.
release = u'2.1.0'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None

# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#
# today = ''
#
# Else, today_fmt is used as the format for a strftime call.
#
# today_fmt = '%B %d, %Y'

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This patterns also effect to html_static_path and html_extra_path
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']

# The reST default role (used for this markup: `text`) to use for all
# documents.
#
# default_role = None

# If true, '()' will be appended to :func: etc. cross-reference text.
#
# add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#
# add_module_names = True

# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#
# show_authors = False

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# A list of ignored prefixes for module index sorting.
# modindex_common_prefix = []

# If true, keep warnings as "system message" paragraphs in the built documents.
# keep_warnings = False

# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False


# -- Options for HTML output ----------------------------------------------

# The theme to use for HTML and HTML Help pages.  See the documentation for
# a list of builtin themes.
#
html_theme = 'alabaster'

# Theme options are theme-specific and customize the look and feel of a theme
# further.  For a list of options available for each theme, see the
# documentation.
#
# html_theme_options = {}

# Add any paths that contain custom themes here, relative to this directory.
# html_theme_path = []

# The name for this set of Sphinx documents.
# "<project> v<release> documentation" by default.
#
# html_title = u'unittest-xml-reporting v2.1.0'

# A shorter title for the navigation bar.  Default is the same as html_title.
#
# html_short_title = None

# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#
# html_logo = None

# The name of an image file (relative to this directory) to use as a favicon of
# the docs.  This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#
# html_favicon = None

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']

# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
#
# html_extra_path = []

# If not None, a 'Last updated on:' timestamp is inserted at every page
# bottom, using the given strftime format.
# The empty string is equivalent to '%b %d, %Y'.
#
# html_last_updated_fmt = None

# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#
# html_use_smartypants = True

# Custom sidebar templates, maps document names to template names.
#
# html_sidebars = {}

# Additional templates that should be rendered to pages, maps page names to
# template names.
#
# html_additional_pages = {}

# If false, no module index is generated.
#
# html_domain_indices = True

# If false, no index is generated.
#
# html_use_index = True

# If true, the index is split into individual pages for each letter.
#
# html_split_index = False

# If true, links to the reST sources are added to the pages.
#
# html_show_sourcelink = True

# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#
# html_show_sphinx = True

# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#
# html_show_copyright = True

# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it.  The value of this option must be the
# base URL from which the finished HTML is served.
#
# html_use_opensearch = ''

# This is the file name suffix for HTML files (e.g. ".xhtml").
# html_file_suffix = None

# Language to be used for generating the HTML full-text search index.
# Sphinx supports the following languages:
#   'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja'
#   'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr', 'zh'
#
# html_search_language = 'en'

# A dictionary with options for the search language support, empty by default.
# 'ja' uses this config value.
# 'zh' user can custom change `jieba` dictionary path.
#
# html_search_options = {'type': 'default'}

# The name of a javascript file (relative to the configuration directory) that
# implements a search results scorer. If empty, the default will be used.
#
# html_search_scorer = 'scorer.js'

# Output file base name for HTML help builder.
htmlhelp_basename = 'unittest-xml-reportingdoc'

# -- Options for LaTeX output ---------------------------------------------

latex_elements = {
     # The paper size ('letterpaper' or 'a4paper').
     #
     # 'papersize': 'letterpaper',

     # The font size ('10pt', '11pt' or '12pt').
     #
     # 'pointsize': '10pt',

     # Additional stuff for the LaTeX preamble.
     #
     # 'preamble': '',

     # Latex figure (float) alignment
     #
     # 'figure_align': 'htbp',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
#  author, documentclass [howto, manual, or own class]).
latex_documents = [
    (master_doc, 'unittest-xml-reporting.tex', u'unittest-xml-reporting Documentation',
     u'Daniel Fernandes Martins, Damien Nozay', 'manual'),
]

# The name of an image file (relative to this directory) to place at the top of
# the title page.
#
# latex_logo = None

# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#
# latex_use_parts = False

# If true, show page references after internal links.
#
# latex_show_pagerefs = False

# If true, show URL addresses after external links.
#
# latex_show_urls = False

# Documents to append as an appendix to all manuals.
#
# latex_appendices = []

# If false, no module index is generated.
#
# latex_domain_indices = True


# -- Options for manual page output ---------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
    (master_doc, 'unittest-xml-reporting', u'unittest-xml-reporting Documentation',
     [author], 1)
]

# If true, show URL addresses after external links.
#
# man_show_urls = False


# -- Options for Texinfo output -------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
#  dir menu entry, description, category)
texinfo_documents = [
    (master_doc, 'unittest-xml-reporting', u'unittest-xml-reporting Documentation',
     author, 'unittest-xml-reporting', 'One line description of project.',
     'Miscellaneous'),
]

# Documents to append as an appendix to all manuals.
#
# texinfo_appendices = []

# If false, no module index is generated.
#
# texinfo_domain_indices = True

# How to display URL addresses: 'footnote', 'no', or 'inline'.
#
# texinfo_show_urls = 'footnote'

# If true, do not generate a @detailmenu in the "Top" node's menu.
#
# texinfo_no_detailmenu = False


================================================
FILE: docs/index.rst
================================================
unittest-xml-reporting
======================

``unittest-xml-reporting`` is a ``unittest`` test runner that can save
test results to XML files (jUnit) and be consumed by a wide range of
tools such as continuous integration systems.


Getting started
===============

Similar to the ``unittest`` module, you can run::

    python -m xmlrunner test_module
    python -m xmlrunner module.TestClass
    python -m xmlrunner module.Class.test_method

as well as::

    python -m xmlrunner discover [options]

You can also add a top level file to allow running the tests with
the command ``python tests.py``, and configure the test runner
to output the XML reports in the ``test-reports`` directory. ::

    # tests.py

    if __name__ == '__main__':
        unittest.main(
            testRunner=xmlrunner.XMLTestRunner(output='test-reports'),
            # these make sure that some options that are not applicable
            # remain hidden from the help menu.
            failfast=False, buffer=False, catchbreak=False)



Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`



================================================
FILE: pyproject.toml
================================================
[build-system]
build-backend = "setuptools.build_meta"
requires = [
    "setuptools>=77",
    "setuptools-scm[toml]>=6.2.3",
]


[project]
name = "unittest-xml-reporting"
license = "BSD-2-Clause"
license-files = [ "LICENSE" ]
dynamic = [
    "version",
]
description = "unittest-based test runner with Ant/JUnit like XML reporting."
keywords = [
    "pyunit",
    "unittest",
    "junit xml",
    "xunit",
    "report",
    "testrunner",
    "xmlrunner",
]
readme = "README.md"
authors = [
    {name = "Daniel Fernandes Martins"},
    {name = "Damien Nozay"},
]
classifiers = [
    "Development Status :: 5 - Production/Stable",
    "Intended Audience :: Developers",
    "Natural Language :: English",
    "Operating System :: OS Independent",
    "Programming Language :: Python :: 3 :: Only",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: 3.12",
    "Programming Language :: Python :: 3.13",
    "Programming Language :: Python :: 3.14",
    "Programming Language :: Python :: Implementation :: CPython",
    "Programming Language :: Python :: Implementation :: PyPy",
    "Topic :: Software Development :: Libraries :: Python Modules",
    "Topic :: Software Development :: Testing",
]
requires-python = ">=3.10"
dependencies = [
    "lxml",
]

[project.urls]
Homepage = "http://github.com/xmlrunner/unittest-xml-reporting/tree/master/"


[tool.setuptools_scm]
write_to = "xmlrunner/version.py"

[tool.distutils.bdist_wheel]
universal = 1
python-tag = "py3"


================================================
FILE: tests/__init__.py
================================================


================================================
FILE: tests/builder_test.py
================================================
# -*- coding: utf-8

from xmlrunner.unittest import unittest

import xml.etree.ElementTree as ET
from xml.dom.minidom import Document

from xmlrunner import builder


class TestXMLContextTest(unittest.TestCase):
    """TestXMLContext test cases.
    """

    def setUp(self):
        self.doc = Document()
        self.root = builder.TestXMLContext(self.doc)

    def test_current_element_tag_name(self):
        self.root.begin('tag', 'context-name')
        self.assertEqual(self.root.element_tag(), 'tag')

    def test_current_context_name(self):
        self.root.begin('tag', 'context-name')
        name = self.root.element.getAttribute('name')
        self.assertEqual(name, 'context-name')

    def test_current_context_invalid_unicode_name(self):
        self.root.begin('tag', u'context-name\x01\x0B')
        name = self.root.element.getAttribute('name')
        self.assertEqual(name, u'context-name\uFFFD\uFFFD')

    def test_increment_valid_testsuites_counters(self):
        self.root.begin('testsuites', 'name')

        for c in ('tests', 'failures', 'errors', 'skipped'):
            self.root.increment_counter(c)

        element = self.root.end()

        with self.assertRaises(KeyError):
            element.attributes['skipped']

        for c in ('tests', 'failures', 'errors'):
            value = element.attributes[c].value
            self.assertEqual(value, '1')

    def test_increment_valid_testsuite_counters(self):
        self.root.begin('testsuite', 'name')

        for c in ('tests', 'failures', 'errors', 'skipped'):
            self.root.increment_counter(c)

        element = self.root.end()

        for c in ('tests', 'failures', 'errors', 'skipped'):
            value = element.attributes[c].value
            self.assertEqual(value, '1')

    def test_increment_counters_for_unknown_context(self):
        self.root.begin('unknown', 'name')

        for c in ('tests', 'failures', 'errors', 'skipped', 'invalid'):
            self.root.increment_counter(c)

        element = self.root.end()

        for c in ('tests', 'failures', 'errors', 'skipped', 'invalid'):
            with self.assertRaises(KeyError):
                element.attributes[c]

    def test_empty_counters_on_end_context(self):
        self.root.begin('testsuite', 'name')
        element = self.root.end()

        for c in ('tests', 'failures', 'errors', 'skipped'):
            self.assertEqual(element.attributes[c].value, '0')

    def test_add_time_attribute_on_end_context(self):
        self.root.begin('testsuite', 'name')
        element = self.root.end()

        element.attributes['time'].value

    def test_add_timestamp_attribute_on_end_context(self):
        self.root.begin('testsuite', 'name')
        element = self.root.end()

        element.attributes['timestamp'].value


class TestXMLBuilderTest(unittest.TestCase):
    """TestXMLBuilder test cases.
    """

    def setUp(self):
        self.builder = builder.TestXMLBuilder()
        self.doc = self.builder._xml_doc
        self.builder.begin_context('testsuites', 'name')

        self.valid_chars = u'выбор'

        self.invalid_chars = '\x01'
        self.invalid_chars_replace = u'\ufffd'

    def test_root_has_no_parent(self):
        self.assertIsNone(self.builder.current_context().parent)

    def test_current_context_tag(self):
        self.assertEqual(self.builder.context_tag(), 'testsuites')

    def test_begin_nested_context(self):
        root = self.builder.current_context()

        self.builder.begin_context('testsuite', 'name')

        self.assertEqual(self.builder.context_tag(), 'testsuite')
        self.assertIs(self.builder.current_context().parent, root)

    def test_end_inexistent_context(self):
        self.builder = builder.TestXMLBuilder()

        self.assertFalse(self.builder.end_context())
        self.assertEqual(len(self.doc.childNodes), 0)

    def test_end_root_context(self):
        root = self.builder.current_context()

        self.assertTrue(self.builder.end_context())
        self.assertIsNone(self.builder.current_context())

        # No contexts left
        self.assertFalse(self.builder.end_context())

        doc_children = self.doc.childNodes

        self.assertEqual(len(doc_children), 1)
        self.assertEqual(len(doc_children[0].childNodes), 0)
        self.assertEqual(doc_children[0].tagName, root.element_tag())

    def test_end_nested_context(self):
        self.builder.begin_context('testsuite', 'name')
        self.builder.current_context()

        self.assertTrue(self.builder.end_context())

        # Only updates the document when all contexts end
        self.assertEqual(len(self.doc.childNodes), 0)

    def test_end_all_context_stack(self):
        root = self.builder.current_context()

        self.builder.begin_context('testsuite', 'name')
        nested = self.builder.current_context()

        self.assertTrue(self.builder.end_context())
        self.assertTrue(self.builder.end_context())

        # No contexts left
        self.assertFalse(self.builder.end_context())

        root_child = self.doc.childNodes

        self.assertEqual(len(root_child), 1)
        self.assertEqual(root_child[0].tagName, root.element_tag())

        nested_child = root_child[0].childNodes

        self.assertEqual(len(nested_child), 1)
        self.assertEqual(nested_child[0].tagName, nested.element_tag())

    def test_append_valid_unicode_cdata_section(self):
        self.builder.append_cdata_section('tag', self.valid_chars)
        self.builder.end_context()

        root_child = self.doc.childNodes[0]

        cdata_container = root_child.childNodes[0]
        self.assertEqual(cdata_container.tagName, 'tag')

        cdata = cdata_container.childNodes[0]
        self.assertEqual(cdata.data, self.valid_chars)

    def test_append_invalid_unicode_cdata_section(self):
        self.builder.append_cdata_section('tag', self.invalid_chars)
        self.builder.end_context()

        root_child = self.doc.childNodes[0]
        cdata_container = root_child.childNodes[0]

        cdata = cdata_container.childNodes[0]
        self.assertEqual(cdata.data, self.invalid_chars_replace)

    def test_append_cdata_closing_tags_into_cdata_section(self):
        self.builder.append_cdata_section('tag', ']]>')
        self.builder.end_context()
        root_child = self.doc.childNodes[0]
        cdata_container = root_child.childNodes[0]
        self.assertEqual(len(cdata_container.childNodes), 2)
        self.assertEqual(cdata_container.childNodes[0].data, ']]')
        self.assertEqual(cdata_container.childNodes[1].data, '>')

    def test_append_tag_with_valid_unicode_values(self):
        self.builder.append('tag', self.valid_chars, attr=self.valid_chars)
        self.builder.end_context()

        root_child = self.doc.childNodes[0]
        tag = root_child.childNodes[0]

        self.assertEqual(tag.tagName, 'tag')
        self.assertEqual(tag.getAttribute('attr'), self.valid_chars)
        self.assertEqual(tag.childNodes[0].data, self.valid_chars)

    def test_append_tag_with_invalid_unicode_values(self):
        self.builder.append('tag', self.invalid_chars, attr=self.invalid_chars)
        self.builder.end_context()

        root_child = self.doc.childNodes[0]
        tag = root_child.childNodes[0]

        self.assertEqual(tag.tagName, 'tag')
        self.assertEqual(tag.getAttribute('attr'), self.invalid_chars_replace)
        self.assertEqual(tag.childNodes[0].data, self.invalid_chars_replace)

    def test_increment_root_context_counter(self):
        self.builder.increment_counter('tests')
        self.builder.end_context()

        root_child = self.doc.childNodes[0]

        self.assertEqual(root_child.tagName, 'testsuites')
        self.assertEqual(root_child.getAttribute('tests'), '1')

    def test_increment_nested_context_counter(self):
        self.builder.increment_counter('tests')

        self.builder.begin_context('testsuite', 'name')
        self.builder.increment_counter('tests')

        self.builder.end_context()
        self.builder.end_context()

        root_child = self.doc.childNodes[0]
        nested_child = root_child.childNodes[0]

        self.assertEqual(root_child.tagName, 'testsuites')
        self.assertEqual(nested_child.getAttribute('tests'), '1')
        self.assertEqual(root_child.getAttribute('tests'), '2')

    def test_finish_nested_context(self):
        self.builder.begin_context('testsuite', 'name')

        tree = ET.fromstring(self.builder.finish())

        self.assertEqual(tree.tag, 'testsuites')
        self.assertEqual(len(tree.findall("./testsuite")), 1)


================================================
FILE: tests/discovery_test.py
================================================
import unittest


class DiscoveryTest(unittest.TestCase):
    def test_discovery_pass(self):
        pass


================================================
FILE: tests/django_example/app/__init__.py
================================================


================================================
FILE: tests/django_example/app/admin.py
================================================
from django.contrib import admin  # NOQA

# Register your models here.


================================================
FILE: tests/django_example/app/migrations/__init__.py
================================================


================================================
FILE: tests/django_example/app/models.py
================================================
from django.db import models  # NOQA

# Create your models here.


================================================
FILE: tests/django_example/app/tests.py
================================================
from django.test import TestCase


# Create your tests here.
class DummyTestCase(TestCase):
    def test_pass(self):
        """Test Pass"""
        pass

    def test_negative_comment1(self):
        """Use a close comment XML tag -->"""
        pass

    def test_negative_comment2(self):
        """Check XML tag </testsuites>"""
        pass


================================================
FILE: tests/django_example/app/views.py
================================================
from django.shortcuts import render  # NOQA

# Create your views here.


================================================
FILE: tests/django_example/app2/__init__.py
================================================


================================================
FILE: tests/django_example/app2/admin.py
================================================
from django.contrib import admin  # NOQA

# Register your models here.


================================================
FILE: tests/django_example/app2/migrations/__init__.py
================================================


================================================
FILE: tests/django_example/app2/models.py
================================================
from django.db import models  # NOQA

# Create your models here.


================================================
FILE: tests/django_example/app2/tests.py
================================================
from django.test import TestCase


# Create your tests here.
class DummyTestCase(TestCase):
    def test_pass(self):
        pass


================================================
FILE: tests/django_example/app2/views.py
================================================
from django.shortcuts import render  # NOQA

# Create your views here.


================================================
FILE: tests/django_example/example/__init__.py
================================================


================================================
FILE: tests/django_example/example/settings.py
================================================

import os


BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
SECRET_KEY = 'not-a-secret'
DEBUG = True
ALLOWED_HOSTS = []
INSTALLED_APPS = ['app', 'app2']
MIDDLEWARE_CLASSES = []
ROOT_URLCONF = 'example.urls'
TEMPLATES = []
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
STATIC_URL = '/static/'

# The settings we care about for xmlrunner.
# They are commented out because we will use settings.configure() in tests.

# TEST_RUNNER = 'xmlrunner.extra.djangotestrunner.XMLTestRunner'
# TEST_OUTPUT_FILE_NAME = 'results.xml'
# TEST_OUTPUT_VERBOSE = 2


================================================
FILE: tests/django_example/example/urls.py
================================================

from django.conf.urls import url
from django.contrib import admin


urlpatterns = [
    url(r'^admin/', admin.site.urls),
]


================================================
FILE: tests/django_example/example/wsgi.py
================================================

import os
from django.core.wsgi import get_wsgi_application


os.environ.setdefault("DJANGO_SETTINGS_MODULE", "example.settings")
application = get_wsgi_application()


================================================
FILE: tests/django_example/manage.py
================================================
#!/usr/bin/env python
import os
import sys


if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "example.settings")
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)


================================================
FILE: tests/django_test.py
================================================
from xmlrunner.unittest import unittest

import sys
import os
from os import path
import glob
from unittest import mock
import tempfile
import shutil

try:
    import django
except ImportError:
    django = None
else:
    from django.test.utils import get_runner
    from django.conf import settings, UserSettingsHolder
    from django.apps import apps
    settings.configure(DEBUG=True)


TESTS_DIR = path.dirname(__file__)


@unittest.skipIf(django is None, 'django not found')
class DjangoTest(unittest.TestCase):

    def setUp(self):
        self._old_cwd = os.getcwd()
        self.project_dir = path.abspath(path.join(TESTS_DIR, 'django_example'))
        self.tmpdir = tempfile.mkdtemp()
        os.chdir(self.project_dir)
        sys.path.append(self.project_dir)
        # allow changing settings
        self.old_settings = settings._wrapped
        os.environ['DJANGO_SETTINGS_MODULE'] = 'example.settings'
        settings.INSTALLED_APPS  # load settings on first access
        settings.DATABASES['default'] = {}
        settings.DATABASES['default']['NAME'] = path.join(
            self.tmpdir, 'db.sqlite3')
        # this goes around the "settings already loaded" issue.
        self.override = UserSettingsHolder(settings._wrapped)
        settings._wrapped = self.override

    def tearDown(self):
        os.chdir(self._old_cwd)
        shutil.rmtree(self.tmpdir)
        settings._wrapped = self.old_settings

    def _override_settings(self, **kwargs):
        # see django.test.utils.override_settings
        for key, new_value in kwargs.items():
            setattr(self.override, key, new_value)

    def _check_runner(self, runner):
        suite = runner.build_suite(test_labels=['app2', 'app'])
        test_ids = [test.id() for test in suite]
        self.assertEqual(test_ids, [
            'app2.tests.DummyTestCase.test_pass',
            'app.tests.DummyTestCase.test_negative_comment1',
            'app.tests.DummyTestCase.test_negative_comment2',
            'app.tests.DummyTestCase.test_pass',
        ])
        suite = runner.build_suite(test_labels=[])
        test_ids = [test.id() for test in suite]
        self.assertEqual(set(test_ids), set([
            'app.tests.DummyTestCase.test_pass',
            'app.tests.DummyTestCase.test_negative_comment1',
            'app.tests.DummyTestCase.test_negative_comment2',
            'app2.tests.DummyTestCase.test_pass',
        ]))

    def test_django_runner(self):
        runner_class = get_runner(settings)
        runner = runner_class()
        self._check_runner(runner)

    def test_django_xmlrunner(self):
        self._override_settings(
            TEST_RUNNER='xmlrunner.extra.djangotestrunner.XMLTestRunner')
        runner_class = get_runner(settings)
        runner = runner_class()
        self._check_runner(runner)

    def test_django_verbose(self):
        self._override_settings(
            TEST_OUTPUT_VERBOSE=True,
            TEST_RUNNER='xmlrunner.extra.djangotestrunner.XMLTestRunner')
        runner_class = get_runner(settings)
        runner = runner_class()
        self._check_runner(runner)

    def test_django_single_report(self):
        self._override_settings(
            TEST_OUTPUT_DIR=self.tmpdir,
            TEST_OUTPUT_FILE_NAME='results.xml',
            TEST_OUTPUT_VERBOSE=0,
            TEST_RUNNER='xmlrunner.extra.djangotestrunner.XMLTestRunner')
        apps.populate(settings.INSTALLED_APPS)
        runner_class = get_runner(settings)
        runner = runner_class()
        suite = runner.build_suite()
        runner.run_suite(suite)
        expected_file = path.join(self.tmpdir, 'results.xml')
        self.assertTrue(path.exists(expected_file),
                        'did not generate xml report where expected.')

    def test_django_single_report_create_folder(self):
        intermediate_directory = 'report'
        directory = path.join(self.tmpdir, intermediate_directory)
        self._override_settings(
            TEST_OUTPUT_DIR=directory,
            TEST_OUTPUT_FILE_NAME='results.xml',
            TEST_OUTPUT_VERBOSE=0,
            TEST_RUNNER='xmlrunner.extra.djangotestrunner.XMLTestRunner')
        apps.populate(settings.INSTALLED_APPS)
        runner_class = get_runner(settings)
        runner = runner_class()
        suite = runner.build_suite()
        runner.run_suite(suite)
        expected_file = path.join(directory, 'results.xml')
        self.assertTrue(path.exists(expected_file),
                        'did not generate xml report where expected.')

    def test_django_multiple_reports(self):
        self._override_settings(
            TEST_OUTPUT_DIR=self.tmpdir,
            TEST_OUTPUT_VERBOSE=0,
            TEST_RUNNER='xmlrunner.extra.djangotestrunner.XMLTestRunner')
        apps.populate(settings.INSTALLED_APPS)
        runner_class = get_runner(settings)
        runner = runner_class()
        suite = runner.build_suite(test_labels=None)
        runner.run_suite(suite)
        test_files = glob.glob(path.join(self.tmpdir, 'TEST*.xml'))
        self.assertTrue(test_files,
                        'did not generate xml reports where expected.')
        self.assertEqual(2, len(test_files))

    def test_django_runner_extension(self):
        from xmlrunner.extra.djangotestrunner import XMLTestRunner

        class MyDjangoRunner(XMLTestRunner):
            test_runner = mock.Mock()
        
        self._override_settings(
            TEST_OUTPUT_DIR=self.tmpdir,
            TEST_OUTPUT_VERBOSE=0)
        apps.populate(settings.INSTALLED_APPS)

        runner = MyDjangoRunner()
        suite = runner.build_suite(test_labels=None)
        runner.run_suite(suite)
        
        self.assertTrue(MyDjangoRunner.test_runner.called)


================================================
FILE: tests/doctest_example.py
================================================

def twice(n):
    """
    >>> twice(5)
    10
    """
    return 2 * n


class Multiplicator(object):
    def threetimes(self, n):
        """
        >>> Multiplicator().threetimes(5)
        15
        """
        return 3 * n


================================================
FILE: tests/testsuite.py
================================================
#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""Executable module to test unittest-xml-reporting.
"""
from __future__ import print_function

import contextlib
import io
import sys

from xmlrunner.unittest import unittest
import xmlrunner
from xmlrunner.result import _DuplicateWriter
from xmlrunner.result import _XMLTestResult
from xmlrunner.result import resolve_filename
import doctest
import tests.doctest_example
from io import StringIO, BytesIO
from tempfile import mkdtemp
from tempfile import mkstemp
from shutil import rmtree
from glob import glob
from xml.dom import minidom
from lxml import etree
import os
import os.path
from unittest import mock


def _load_schema(version):
    path = os.path.join(
        os.path.dirname(__file__),
        'vendor/jenkins/xunit-plugin', version, 'junit-10.xsd')
    with open(path, 'r') as schema_file:
        schema_doc = etree.parse(schema_file)
        schema = etree.XMLSchema(schema_doc)
        return schema
    raise RuntimeError('Could not load JUnit schema')  # pragma: no cover


def validate_junit_report(version, text):
    document = etree.parse(BytesIO(text))
    schema = _load_schema(version)
    schema.assertValid(document)


class DoctestTest(unittest.TestCase):

    def test_doctest_example(self):
        suite = doctest.DocTestSuite(tests.doctest_example)
        outdir = BytesIO()
        stream = StringIO()
        runner = xmlrunner.XMLTestRunner(
            stream=stream, output=outdir, verbosity=0)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        self.assertIn('classname="tests.doctest_example.Multiplicator"'.encode('utf8'), output)
        self.assertIn('name="threetimes"'.encode('utf8'), output)
        self.assertIn('classname="tests.doctest_example"'.encode('utf8'), output)
        self.assertIn('name="twice"'.encode('utf8'), output)


@contextlib.contextmanager
def capture_stdout_stderr():
    """
    context manager to capture stdout and stderr
    """
    orig_stdout = sys.stdout
    orig_stderr = sys.stderr
    sys.stdout = StringIO()
    sys.stderr = StringIO()
    try:
        yield (sys.stdout, sys.stderr)
    finally:
        sys.stdout = orig_stdout
        sys.stderr = orig_stderr


def _strip_xml(xml, changes):
    doc = etree.fromstring(xml)
    for xpath, attributes in changes.items():
        for node in doc.xpath(xpath):
            for attrib in node.attrib.keys():
                if attrib not in attributes:
                    del node.attrib[attrib]
    return etree.tostring(doc)


def some_decorator(f):
    # for issue #195
    code = """\
def wrapper(*args, **kwargs):
    return func(*args, **kwargs)
"""
    evaldict = dict(func=f)
    exec(code, evaldict)
    return evaldict['wrapper']


class XMLTestRunnerTestCase(unittest.TestCase):
    """
    XMLTestRunner test case.
    """
    class DummyTest(unittest.TestCase):

        @unittest.skip("demonstrating skipping")
        def test_skip(self):
            pass   # pragma: no cover

        @unittest.skip(u"demonstrating non-ascii skipping: éçà")
        def test_non_ascii_skip(self):
            pass   # pragma: no cover

        def test_pass(self):
            pass

        def test_fail(self):
            self.assertTrue(False)

        @unittest.expectedFailure
        def test_expected_failure(self):
            self.assertTrue(False)

        @unittest.expectedFailure
        def test_unexpected_success(self):
            pass

        def test_error(self):
            1 / 0

        def test_cdata_section(self):
            print('<![CDATA[content]]>')

        def test_invalid_xml_chars_in_doc(self):
            """
            Testing comments, -- is not allowed, or invalid xml 1.0 chars such as \x0c
            """
            pass

        def test_non_ascii_error(self):
            self.assertEqual(u"éçà", 42)

        def test_unsafe_unicode(self):
            print(u"A\x00B\x08C\x0BD\x0C")

        def test_output_stdout_and_stderr(self):
            print('test on stdout')
            print('test on stderr', file=sys.stderr)

        def test_runner_buffer_output_pass(self):
            print('should not be printed')

        def test_runner_buffer_output_fail(self):
            print('should be printed')
            self.fail('expected to fail')

        def test_output(self):
            print('test message')

        def test_non_ascii_runner_buffer_output_fail(self):
            print(u'Where is the café ?')
            self.fail(u'The café could not be found')

    class DummySubTest(unittest.TestCase):

        def test_subTest_pass(self):
            for i in range(2):
                with self.subTest(i=i):
                    pass

        def test_subTest_fail(self):
            for i in range(2):
                with self.subTest(i=i):
                    self.fail('this is a subtest.')

        def test_subTest_error(self):
            for i in range(2):
                with self.subTest(i=i):
                    raise Exception('this is a subtest')

        def test_subTest_mixed(self):
            for i in range(2):
                with self.subTest(i=i):
                    self.assertLess(i, 1, msg='this is a subtest.')

        def test_subTest_with_dots(self):
            for i in range(2):
                with self.subTest(module='hello.world.subTest{}'.format(i)):
                    self.fail('this is a subtest.')

    class DecoratedUnitTest(unittest.TestCase):

        @some_decorator
        def test_pass(self):
            pass

    class DummyErrorInCallTest(unittest.TestCase):

        def __call__(self, result):
            try:
                raise Exception('Massive fail')
            except Exception:
                result.addError(self, sys.exc_info())
                return

        def test_pass(self):
            # it is expected not to be called.
            pass  # pragma: no cover

    class DummyRefCountTest(unittest.TestCase):
        class dummy(object):
            pass
        def test_fail(self):
            inst = self.dummy()
            self.assertTrue(False)

    def setUp(self):
        self.stream = StringIO()
        self.outdir = mkdtemp()
        self.verbosity = 0
        self.runner_kwargs = {}
        self.addCleanup(rmtree, self.outdir)

    def _test_xmlrunner(self, suite, runner=None, outdir=None):
        if outdir is None:
            outdir = self.outdir
        stream = self.stream
        verbosity = self.verbosity
        runner_kwargs = self.runner_kwargs
        if runner is None:
            runner = xmlrunner.XMLTestRunner(
                stream=stream, output=outdir, verbosity=verbosity,
                **runner_kwargs)
        if isinstance(outdir, BytesIO):
            self.assertFalse(outdir.getvalue())
        else:
            self.assertEqual(0, len(glob(os.path.join(outdir, '*xml'))))
        runner.run(suite)
        if isinstance(outdir, BytesIO):
            self.assertTrue(outdir.getvalue())
        else:
            self.assertEqual(1, len(glob(os.path.join(outdir, '*xml'))))
        return runner

    def test_basic_unittest_constructs(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        suite.addTest(self.DummyTest('test_skip'))
        suite.addTest(self.DummyTest('test_fail'))
        suite.addTest(self.DummyTest('test_expected_failure'))
        suite.addTest(self.DummyTest('test_unexpected_success'))
        suite.addTest(self.DummyTest('test_error'))
        self._test_xmlrunner(suite)

    def test_classnames(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        suite.addTest(self.DummySubTest('test_subTest_pass'))
        outdir = BytesIO()
        stream = StringIO()
        runner = xmlrunner.XMLTestRunner(
            stream=stream, output=outdir, verbosity=0)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        output = _strip_xml(output, {
            '//testsuite': (),
            '//testcase': ('classname', 'name'),
            '//failure': ('message',),
        })
        self.assertRegex(
            output,
            r'classname="tests\.testsuite\.(XMLTestRunnerTestCase\.)?'
            r'DummyTest" name="test_pass"'.encode('utf8'),
        )
        self.assertRegex(
            output,
            r'classname="tests\.testsuite\.(XMLTestRunnerTestCase\.)?'
            r'DummySubTest" name="test_subTest_pass"'.encode('utf8'),
        )

    def test_expected_failure(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_expected_failure'))
        outdir = BytesIO()

        self._test_xmlrunner(suite, outdir=outdir)

        self.assertNotIn(b'<failure', outdir.getvalue())
        self.assertNotIn(b'<error', outdir.getvalue())
        self.assertIn(b'<skip', outdir.getvalue())

    def test_unexpected_success(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_unexpected_success'))
        outdir = BytesIO()

        self._test_xmlrunner(suite, outdir=outdir)

        self.assertNotIn(b'<failure', outdir.getvalue())
        self.assertIn(b'<error', outdir.getvalue())
        self.assertNotIn(b'<skip', outdir.getvalue())

    def test_xmlrunner_non_ascii(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_non_ascii_skip'))
        suite.addTest(self.DummyTest('test_non_ascii_error'))
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        self.assertIn(
            u'message="demonstrating non-ascii skipping: éçà"'.encode('utf8'),
            output)

    def test_xmlrunner_safe_xml_encoding_name(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        firstline = output.splitlines()[0]
        # test for issue #74
        self.assertIn('encoding="UTF-8"'.encode('utf8'), firstline)

    def test_xmlrunner_check_for_valid_xml_streamout(self):
        """
        This test checks if the xml document is valid if there are more than
        one testsuite and the output of the report is a single stream.
        """
        class DummyTestA(unittest.TestCase):

            def test_pass(self):
                pass

        class DummyTestB(unittest.TestCase):

            def test_pass(self):
                pass

        suite = unittest.TestSuite()
        suite.addTest(unittest.TestLoader().loadTestsFromTestCase(DummyTestA))
        suite.addTest(unittest.TestLoader().loadTestsFromTestCase(DummyTestB))
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        # Finally check if we have a valid XML document or not.
        try:
            minidom.parseString(output)
        except Exception as e:  # pragma: no cover
            # note: we could remove the try/except, but it's more crude.
            self.fail(e)

    def test_xmlrunner_unsafe_unicode(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_unsafe_unicode'))
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        self.assertIn(u"<![CDATA[ABCD\n]]>".encode('utf8'),
                      output)

    def test_xmlrunner_non_ascii_failures(self):
        self._xmlrunner_non_ascii_failures()

    def test_xmlrunner_non_ascii_failures_buffered_output(self):
        self._xmlrunner_non_ascii_failures(buffer=True)

    def _xmlrunner_non_ascii_failures(self, buffer=False):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest(
            'test_non_ascii_runner_buffer_output_fail'))
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            buffer=buffer, **self.runner_kwargs)

        # allow output non-ascii letters to stdout
        orig_stdout = sys.stdout
        sys.stdout = io.TextIOWrapper(sys.stdout.buffer, encoding='utf-8')

        try:
            runner.run(suite)
        finally:
            # Not to be closed when TextIOWrapper is disposed.
            sys.stdout.detach()
            sys.stdout = orig_stdout
        outdir.seek(0)
        output = outdir.read()
        self.assertIn(
            u'Where is the café ?'.encode('utf8'),
            output)
        self.assertIn(
            u'The café could not be found'.encode('utf8'),
            output)

    @unittest.expectedFailure
    def test_xmlrunner_buffer_output_pass(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_runner_buffer_output_pass'))
        self._test_xmlrunner(suite)
        testsuite_output = self.stream.getvalue()
        # Since we are always buffering stdout/stderr
        # it is currently troublesome to print anything at all
        # and be consistent with --buffer option (issue #59)
        self.assertIn('should not be printed', testsuite_output)
        # this will be fixed when using the composite approach
        # that was under development in the rewrite branch.

    def test_xmlrunner_buffer_output_fail(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_runner_buffer_output_fail'))
        # --buffer option
        self.runner_kwargs['buffer'] = True
        self._test_xmlrunner(suite)
        testsuite_output = self.stream.getvalue()
        self.assertIn('should be printed', testsuite_output)

    def test_xmlrunner_output_without_buffer(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_output'))
        with capture_stdout_stderr() as r:
            self._test_xmlrunner(suite)
        output_from_test = r[0].getvalue()
        self.assertIn('test message', output_from_test)

    def test_xmlrunner_output_with_buffer(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_output'))
        # --buffer option
        self.runner_kwargs['buffer'] = True
        with capture_stdout_stderr() as r:
            self._test_xmlrunner(suite)
        output_from_test = r[0].getvalue()
        self.assertNotIn('test message', output_from_test)

    def test_xmlrunner_stdout_stderr_recovered_without_buffer(self):
        orig_stdout = sys.stdout
        orig_stderr = sys.stderr
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        self._test_xmlrunner(suite)
        self.assertIs(orig_stdout, sys.stdout)
        self.assertIs(orig_stderr, sys.stderr)

    def test_xmlrunner_stdout_stderr_recovered_with_buffer(self):
        orig_stdout = sys.stdout
        orig_stderr = sys.stderr
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        # --buffer option
        self.runner_kwargs['buffer'] = True
        self._test_xmlrunner(suite)
        self.assertIs(orig_stdout, sys.stdout)
        self.assertIs(orig_stderr, sys.stderr)
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))

    @unittest.skipIf(not hasattr(unittest.TestCase, 'subTest'),
                     'unittest.TestCase.subTest not present.')
    def test_unittest_subTest_fail(self):
        # test for issue #77
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        suite = unittest.TestSuite()
        suite.addTest(self.DummySubTest('test_subTest_fail'))
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        output = _strip_xml(output, {
            '//testsuite': (),
            '//testcase': ('classname', 'name'),
            '//failure': ('message',),
        })
        self.assertRegex(
            output,
            br'<testcase classname="tests\.testsuite\.'
            br'(XMLTestRunnerTestCase\.)?DummySubTest" '
            br'name="test_subTest_fail \(i=0\)"')
        self.assertRegex(
            output,
            br'<testcase classname="tests\.testsuite\.'
            br'(XMLTestRunnerTestCase\.)?DummySubTest" '
            br'name="test_subTest_fail \(i=1\)"')

    @unittest.skipIf(not hasattr(unittest.TestCase, 'subTest'),
                     'unittest.TestCase.subTest not present.')
    def test_unittest_subTest_error(self):
        # test for issue #155
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        suite = unittest.TestSuite()
        suite.addTest(self.DummySubTest('test_subTest_error'))
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        output = _strip_xml(output, {
            '//testsuite': (),
            '//testcase': ('classname', 'name'),
            '//failure': ('message',),
        })
        self.assertRegex(
            output,
            br'<testcase classname="tests\.testsuite\.'
            br'(XMLTestRunnerTestCase\.)?DummySubTest" '
            br'name="test_subTest_error \(i=0\)"')
        self.assertRegex(
            output,
            br'<testcase classname="tests\.testsuite\.'
            br'(XMLTestRunnerTestCase\.)?DummySubTest" '
            br'name="test_subTest_error \(i=1\)"')


    @unittest.skipIf(not hasattr(unittest.TestCase, 'subTest'),
                     'unittest.TestCase.subTest not present.')
    def test_unittest_subTest_mixed(self):
        # test for issue #155
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        suite = unittest.TestSuite()
        suite.addTest(self.DummySubTest('test_subTest_mixed'))
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        output = _strip_xml(output, {
            '//testsuite': (),
            '//testcase': ('classname', 'name'),
            '//failure': ('message',),
        })
        self.assertNotIn(
            'name="test_subTest_mixed (i=0)"'.encode('utf8'),
            output)
        self.assertIn(
            'name="test_subTest_mixed (i=1)"'.encode('utf8'),
            output)

    @unittest.skipIf(not hasattr(unittest.TestCase, 'subTest'),
                     'unittest.TestCase.subTest not present.')
    def test_unittest_subTest_pass(self):
        # Test for issue #85
        suite = unittest.TestSuite()
        suite.addTest(self.DummySubTest('test_subTest_pass'))
        self._test_xmlrunner(suite)

    @unittest.skipIf(not hasattr(unittest.TestCase, 'subTest'),
                     'unittest.TestCase.subTest not present.')
    def test_unittest_subTest_with_dots(self):
        # Test for issue #85
        suite = unittest.TestSuite()
        suite.addTest(self.DummySubTest('test_subTest_with_dots'))
        outdir = BytesIO()

        self._test_xmlrunner(suite, outdir=outdir)

        xmlcontent = outdir.getvalue().decode()

        # Method name
        self.assertNotIn('name="subTest', xmlcontent, 'parsing of test method name is not done correctly')
        self.assertIn('name="test_subTest_with_dots (module=\'hello.world.subTest', xmlcontent)

        # Class name
        matchString = 'classname="tests.testsuite.XMLTestRunnerTestCase.DummySubTest.test_subTest_with_dots (module=\'hello.world"'
        self.assertNotIn(matchString, xmlcontent, 'parsing of class name is not done correctly')
        self.assertIn('classname="tests.testsuite.XMLTestRunnerTestCase.DummySubTest"', xmlcontent)

    def test_xmlrunner_pass(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        self._test_xmlrunner(suite)

    def test_xmlrunner_failfast(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_fail'))
        suite.addTest(self.DummyTest('test_pass'))
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir,
            verbosity=self.verbosity, failfast=True,
            **self.runner_kwargs)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        self.assertIn('test_fail'.encode('utf8'), output)
        self.assertNotIn('test_pass'.encode('utf8'), output)

    def test_xmlrunner_verbose(self):
        self.verbosity = 1
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        self._test_xmlrunner(suite)

    def test_xmlrunner_showall(self):
        self.verbosity = 2
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        self._test_xmlrunner(suite)

    def test_xmlrunner_cdata_section(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_cdata_section'))
        self._test_xmlrunner(suite)

    def test_xmlrunner_invalid_xml_chars_in_doc(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_invalid_xml_chars_in_doc'))
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        # Finally check if we have a valid XML document or not.
        try:
            minidom.parseString(output)
        except Exception as e:  # pragma: no cover
            # note: we could remove the try/except, but it's more crude.
            self.fail(e)

    def test_xmlrunner_outsuffix(self):
        self.runner_kwargs['outsuffix'] = '.somesuffix'
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        self._test_xmlrunner(suite)
        xmlfile = glob(os.path.join(self.outdir, '*xml'))[0]
        assert xmlfile.endswith('.somesuffix.xml')

    def test_xmlrunner_nosuffix(self):
        self.runner_kwargs['outsuffix'] = ''
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        self._test_xmlrunner(suite)
        xmlfile = glob(os.path.join(self.outdir, '*xml'))[0]
        xmlfile = os.path.basename(xmlfile)
        assert xmlfile.endswith('DummyTest.xml')

    def test_junitxml_properties(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        suite.properties = dict(key='value')
        self._test_xmlrunner(suite)

    def test_junitxml_xsd_validation_order(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_fail'))
        suite.addTest(self.DummyTest('test_pass'))
        suite.addTest(self.DummyTest('test_output_stdout_and_stderr'))
        suite.properties = dict(key='value')
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        # poor man's schema validation; see issue #90
        i_properties = output.index('<properties>'.encode('utf8'))
        i_system_out = output.index('<system-out>'.encode('utf8'))
        i_system_err = output.index('<system-err>'.encode('utf8'))
        i_testcase = output.index('<testcase'.encode('utf8'))
        self.assertTrue(i_properties < i_testcase <
                        i_system_out < i_system_err)
        # XSD validation - for good measure.
        validate_junit_report('14c6e39c38408b9ed6280361484a13c6f5becca7', output)

    def test_junitxml_xsd_validation_empty_properties(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_fail'))
        suite.addTest(self.DummyTest('test_pass'))
        suite.properties = None
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()
        self.assertNotIn('<properties>'.encode('utf8'), output)
        validate_junit_report('14c6e39c38408b9ed6280361484a13c6f5becca7', output)

    @unittest.skipIf(hasattr(sys, 'pypy_version_info'),
                     'skip - PyPy + lxml seems to be hanging')
    def test_xunit_plugin_transform(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_fail'))
        suite.addTest(self.DummyTest('test_pass'))
        suite.properties = None
        outdir = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=self.stream, output=outdir, verbosity=self.verbosity,
            **self.runner_kwargs)
        runner.run(suite)
        outdir.seek(0)
        output = outdir.read()

        validate_junit_report('14c6e39c38408b9ed6280361484a13c6f5becca7', output)
        with self.assertRaises(etree.DocumentInvalid):
            validate_junit_report('ae25da5089d4f94ac6c4669bf736e4d416cc4665', output)

        from xmlrunner.extra.xunit_plugin import transform
        transformed = transform(output)
        validate_junit_report('14c6e39c38408b9ed6280361484a13c6f5becca7', transformed)
        validate_junit_report('ae25da5089d4f94ac6c4669bf736e4d416cc4665', transformed)
        self.assertIn('test_pass'.encode('utf8'), transformed)
        self.assertIn('test_fail'.encode('utf8'), transformed)

    def test_xmlrunner_elapsed_times(self):
        self.runner_kwargs['elapsed_times'] = False
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        self._test_xmlrunner(suite)

    def test_xmlrunner_resultclass(self):
        class Result(_XMLTestResult):
            pass

        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        self.runner_kwargs['resultclass'] = Result
        self._test_xmlrunner(suite)

    def test_xmlrunner_stream(self):
        stream = self.stream
        output = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=stream, output=output, verbosity=self.verbosity,
            **self.runner_kwargs)
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        runner.run(suite)

    def test_xmlrunner_stream_empty_testsuite(self):
        stream = self.stream
        output = BytesIO()
        runner = xmlrunner.XMLTestRunner(
            stream=stream, output=output, verbosity=self.verbosity,
            **self.runner_kwargs)
        suite = unittest.TestSuite()
        runner.run(suite)

    def test_xmlrunner_output_subdir(self):
        stream = self.stream
        output = os.path.join(self.outdir, 'subdir')
        runner = xmlrunner.XMLTestRunner(
            stream=stream, output=output, verbosity=self.verbosity,
            **self.runner_kwargs)
        suite = unittest.TestSuite()
        suite.addTest(self.DummyTest('test_pass'))
        runner.run(suite)

    def test_xmlrunner_patched_stdout(self):
        old_stdout, old_stderr = sys.stdout, sys.stderr
        try:
            sys.stdout, sys.stderr = StringIO(), StringIO()
            suite = unittest.TestSuite()
            suite.addTest(self.DummyTest('test_pass'))
            suite.properties = dict(key='value')
            self._test_xmlrunner(suite)
        finally:
            sys.stdout, sys.stderr = old_stdout, old_stderr

    def test_opaque_decorator(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DecoratedUnitTest('test_pass'))
        self._test_xmlrunner(suite)
        testsuite_output = self.stream.getvalue()
        self.assertNotIn('IOError:', testsuite_output)

    def test_xmlrunner_error_in_call(self):
        suite = unittest.TestSuite()
        suite.addTest(self.DummyErrorInCallTest('test_pass'))
        self._test_xmlrunner(suite)
        testsuite_output = self.stream.getvalue()
        self.assertIn('Exception: Massive fail', testsuite_output)

    @unittest.skipIf(not unittest.BaseTestSuite._cleanup,
                     'skip - do not cleanup')
    @unittest.skipIf(not hasattr(sys, 'getrefcount'),
                     'skip - PyPy does not have sys.getrefcount.')
    @unittest.skipIf((3, 0) <= sys.version_info < (3, 4),
                     'skip - test not garbage collected. '
                     'https://bugs.python.org/issue11798.')
    def test_xmlrunner_hold_traceback(self):
        import gc

        suite = unittest.TestSuite()
        suite.addTest(self.DummyRefCountTest('test_fail'))
        countBeforeTest = sys.getrefcount(self.DummyRefCountTest.dummy)
        runner = self._test_xmlrunner(suite)

        gc.collect()
        countAfterTest = sys.getrefcount(self.DummyRefCountTest.dummy)
        self.assertEqual(countBeforeTest, countAfterTest)

    class StderrXMLTestRunner(xmlrunner.XMLTestRunner):
        """
        XMLTestRunner that outputs to sys.stderr that might be replaced

        Though XMLTestRunner defaults to use sys.stderr as stream,
        it cannot be replaced e.g. by replaced by capture_stdout_stderr(),
        as it's already resolved.
        This class resolved sys.stderr lazily and outputs to replaced sys.stderr.
        """
        def __init__(self, **kwargs):
            super(XMLTestRunnerTestCase.StderrXMLTestRunner, self).__init__(
                stream=sys.stderr,
                **kwargs
            )

    def test_test_program_succeed_with_buffer(self):
        with capture_stdout_stderr() as r:
            unittest.TestProgram(
                module=self.__class__.__module__,
                testRunner=self.StderrXMLTestRunner,
                argv=[
                    sys.argv[0],
                    '-b',
                    'XMLTestRunnerTestCase.DummyTest.test_runner_buffer_output_pass',
                ],
                exit=False,
            )
        self.assertNotIn('should not be printed', r[0].getvalue())
        self.assertNotIn('should not be printed', r[1].getvalue())

    def test_test_program_succeed_wo_buffer(self):
        with capture_stdout_stderr() as r:
            unittest.TestProgram(
                module=self.__class__.__module__,
                testRunner=self.StderrXMLTestRunner,
                argv=[
                    sys.argv[0],
                    'XMLTestRunnerTestCase.DummyTest.test_runner_buffer_output_pass',
                ],
                exit=False,
            )
        self.assertIn('should not be printed', r[0].getvalue())
        self.assertNotIn('should not be printed', r[1].getvalue())

    def test_test_program_fail_with_buffer(self):
        with capture_stdout_stderr() as r:
            unittest.TestProgram(
                module=self.__class__.__module__,
                testRunner=self.StderrXMLTestRunner,
                argv=[
                    sys.argv[0],
                    '-b',
                    'XMLTestRunnerTestCase.DummyTest.test_runner_buffer_output_fail',
                ],
                exit=False,
            )
        self.assertNotIn('should be printed', r[0].getvalue())
        self.assertIn('should be printed', r[1].getvalue())

    def test_test_program_fail_wo_buffer(self):
        with capture_stdout_stderr() as r:
            unittest.TestProgram(
                module=self.__class__.__module__,
                testRunner=self.StderrXMLTestRunner,
                argv=[
                    sys.argv[0],
                    'XMLTestRunnerTestCase.DummyTest.test_runner_buffer_output_fail',
                ],
                exit=False,
            )
        self.assertIn('should be printed', r[0].getvalue())
        self.assertNotIn('should be printed', r[1].getvalue())

    def test_partialmethod(self):
        from functools import partialmethod
        def test_partialmethod(test):
            pass
        class TestWithPartialmethod(unittest.TestCase):
            pass
        setattr(
            TestWithPartialmethod,
            'test_partialmethod',
            partialmethod(test_partialmethod),
        )
        suite = unittest.TestSuite()
        suite.addTest(TestWithPartialmethod('test_partialmethod'))
        self._test_xmlrunner(suite)



class DuplicateWriterTestCase(unittest.TestCase):
    def setUp(self):
        fd, self.file = mkstemp()
        self.fh = os.fdopen(fd, 'w')
        self.buffer = StringIO()
        self.writer = _DuplicateWriter(self.fh, self.buffer)

    def tearDown(self):
        self.buffer.close()
        self.fh.close()
        os.unlink(self.file)

    def getFirstContent(self):
        with open(self.file, 'r') as f:
            return f.read()

    def getSecondContent(self):
        return self.buffer.getvalue()

    def test_flush(self):
        self.writer.write('foobarbaz')
        self.writer.flush()
        self.assertEqual(self.getFirstContent(), self.getSecondContent())

    def test_writable(self):
        self.assertTrue(self.writer.writable())

    def test_writelines(self):
        self.writer.writelines([
            'foo\n',
            'bar\n',
            'baz\n',
        ])
        self.writer.flush()
        self.assertEqual(self.getFirstContent(), self.getSecondContent())

    def test_write(self):
        # try long buffer (1M)
        buffer = 'x' * (1024 * 1024)
        wrote = self.writer.write(buffer)
        self.writer.flush()
        self.assertEqual(self.getFirstContent(), self.getSecondContent())
        self.assertEqual(wrote, len(self.getSecondContent()))


class XMLProgramTestCase(unittest.TestCase):
    @mock.patch('sys.argv', ['xmlrunner', '-o', 'flaf'])
    @mock.patch('xmlrunner.runner.XMLTestRunner')
    @mock.patch('sys.exit')
    def test_xmlrunner_output(self, exiter, testrunner):
        xmlrunner.runner.XMLTestProgram()

        kwargs = dict(
            buffer=mock.ANY,
            failfast=mock.ANY,
            verbosity=mock.ANY,
            warnings=mock.ANY,
            output='flaf',
        )

        if sys.version_info[:2] > (3, 4):
            kwargs.update(tb_locals=mock.ANY)

        testrunner.assert_called_once_with(**kwargs)
        exiter.assert_called_once_with(False)

    @mock.patch('sys.argv', ['xmlrunner', '--output-file', 'test.xml'])
    @mock.patch('xmlrunner.runner.open')
    @mock.patch('xmlrunner.runner.XMLTestRunner')
    @mock.patch('sys.exit')
    def test_xmlrunner_output_file(self, exiter, testrunner, opener):
        xmlrunner.runner.XMLTestProgram()
        opener.assert_called_once_with('test.xml', 'wb')
        open_file = opener()
        open_file.close.assert_called_with()

        kwargs = dict(
            buffer=mock.ANY,
            failfast=mock.ANY,
            verbosity=mock.ANY,
            warnings=mock.ANY,
            output=open_file,
        )

        if sys.version_info[:2] > (3, 4):
            kwargs.update(tb_locals=mock.ANY)

        testrunner.assert_called_once_with(**kwargs)
        exiter.assert_called_once_with(False)

    @mock.patch('sys.argv', ['xmlrunner', '--outsuffix', ''])
    @mock.patch('xmlrunner.runner.open')
    @mock.patch('xmlrunner.runner.XMLTestRunner')
    @mock.patch('sys.exit')
    def test_xmlrunner_outsuffix(self, exiter, testrunner, opener):
        xmlrunner.runner.XMLTestProgram()

        kwargs = dict(
            buffer=mock.ANY,
            failfast=mock.ANY,
            verbosity=mock.ANY,
            warnings=mock.ANY,
            outsuffix='',
        )

        if sys.version_info[:2] > (3, 4):
            kwargs.update(tb_locals=mock.ANY)

        testrunner.assert_called_once_with(**kwargs)
        exiter.assert_called_once_with(False)


class ResolveFilenameTestCase(unittest.TestCase):
    @mock.patch('os.path.relpath')
    def test_resolve_filename_relative(self, relpath):
        relpath.return_value = 'somefile.py'
        filename = resolve_filename('/path/to/somefile.py')
        self.assertEqual(filename, 'somefile.py')

    @mock.patch('os.path.relpath')
    def test_resolve_filename_outside(self, relpath):
        relpath.return_value = '../../../tmp/somefile.py'
        filename = resolve_filename('/tmp/somefile.py')
        self.assertEqual(filename, '/tmp/somefile.py')

    @mock.patch('os.path.relpath')
    def test_resolve_filename_error(self, relpath):
        relpath.side_effect = ValueError("ValueError: path is on mount 'C:', start on mount 'D:'")
        filename = resolve_filename('C:\\path\\to\\somefile.py')
        self.assertEqual(filename, 'C:\\path\\to\\somefile.py')


================================================
FILE: tests/vendor/jenkins/xunit-plugin/14c6e39c38408b9ed6280361484a13c6f5becca7/junit-10.xsd
================================================
<?xml version="1.0" encoding="UTF-8" ?>
<!--
The MIT License (MIT)

Copyright (c) 2014, Gregory Boissinot

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
-->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">

    <xs:element name="failure">
        <xs:complexType mixed="true">
            <xs:attribute name="type" type="xs:string" use="optional"/>
            <xs:attribute name="message" type="xs:string" use="optional"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="error">
        <xs:complexType mixed="true">
            <xs:attribute name="type" type="xs:string" use="optional"/>
            <xs:attribute name="message" type="xs:string" use="optional"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="skipped">
        <xs:complexType mixed="true">
            <xs:attribute name="type" type="xs:string" use="optional"/>
            <xs:attribute name="message" type="xs:string" use="optional"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="properties">
        <xs:complexType>
            <xs:sequence>
                <xs:element ref="property" minOccurs="0" maxOccurs="unbounded"/>
            </xs:sequence>
        </xs:complexType>
    </xs:element>

    <xs:element name="property">
        <xs:complexType>
            <xs:attribute name="name" type="xs:string" use="required"/>
            <xs:attribute name="value" type="xs:string" use="required"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="system-err" type="xs:string"/>
    <xs:element name="system-out" type="xs:string"/>

    <xs:element name="testcase">
        <xs:complexType>
            <xs:sequence>
                <xs:choice minOccurs="0" maxOccurs="unbounded">
                    <xs:element ref="skipped"/>
                    <xs:element ref="error"/>
                    <xs:element ref="failure"/>
                    <xs:element ref="system-out"/>
                    <xs:element ref="system-err"/>
                </xs:choice>
            </xs:sequence>
            <xs:attribute name="name" type="xs:string" use="required"/>
            <xs:attribute name="assertions" type="xs:string" use="optional"/>
            <xs:attribute name="time" type="xs:string" use="optional"/>
            <xs:attribute name="timestamp" type="xs:string" use="optional"/>
            <xs:attribute name="classname" type="xs:string" use="optional"/>
            <xs:attribute name="status" type="xs:string" use="optional"/>
            <xs:attribute name="class" type="xs:string" use="optional"/>
            <xs:attribute name="file" type="xs:string" use="optional"/>
            <xs:attribute name="line" type="xs:string" use="optional"/>
            <xs:attribute name="log" type="xs:string" use="optional"/>
            <xs:attribute name="group" type="xs:string" use="optional"/>
            <xs:attribute name="url" type="xs:string" use="optional"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="testsuite">
        <xs:complexType>
            <xs:choice minOccurs="0" maxOccurs="unbounded">
                <xs:element ref="testsuite"/>
                <xs:element ref="properties"/>
                <xs:element ref="testcase"/>
                <xs:element ref="system-out"/>
                <xs:element ref="system-err"/>
            </xs:choice>
            <xs:attribute name="name" type="xs:string" use="optional"/>
            <xs:attribute name="tests" type="xs:string" use="required"/>
            <xs:attribute name="failures" type="xs:string" use="optional"/>
            <xs:attribute name="errors" type="xs:string" use="optional"/>
            <xs:attribute name="time" type="xs:string" use="optional"/>
            <xs:attribute name="disabled" type="xs:string" use="optional"/>
            <xs:attribute name="skipped" type="xs:string" use="optional"/>
            <xs:attribute name="skips" type="xs:string" use="optional"/>
            <xs:attribute name="timestamp" type="xs:string" use="optional"/>
            <xs:attribute name="hostname" type="xs:string" use="optional"/>
            <xs:attribute name="id" type="xs:string" use="optional"/>
            <xs:attribute name="package" type="xs:string" use="optional"/>
            <xs:attribute name="assertions" type="xs:string" use="optional"/>
            <xs:attribute name="file" type="xs:string" use="optional"/>
            <xs:attribute name="skip" type="xs:string" use="optional"/>
            <xs:attribute name="log" type="xs:string" use="optional"/>
            <xs:attribute name="url" type="xs:string" use="optional"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="testsuites">
        <xs:complexType>
            <xs:sequence>
                <xs:element ref="testsuite" minOccurs="0" maxOccurs="unbounded"/>
            </xs:sequence>
            <xs:attribute name="name" type="xs:string" use="optional"/>
            <xs:attribute name="time" type="xs:string" use="optional"/>
            <xs:attribute name="tests" type="xs:string" use="optional"/>
            <xs:attribute name="failures" type="xs:string" use="optional"/>
            <xs:attribute name="disabled" type="xs:string" use="optional"/>
            <xs:attribute name="errors" type="xs:string" use="optional"/>
        </xs:complexType>
    </xs:element>

</xs:schema>


================================================
FILE: tests/vendor/jenkins/xunit-plugin/ae25da5089d4f94ac6c4669bf736e4d416cc4665/junit-10.xsd
================================================
<?xml version="1.0" encoding="UTF-8" ?>
<!--
The MIT License (MIT)

Copyright (c) 2014, Gregory Boissinot

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
-->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:simpleType name="SUREFIRE_TIME">
        <xs:restriction base="xs:string">
            <xs:pattern value="(([0-9]{0,3},)*[0-9]{3}|[0-9]{0,3})*(\.[0-9]{0,3})?"/>
        </xs:restriction>
    </xs:simpleType>

    <xs:complexType name="rerunType" mixed="true"> <!-- mixed (XML contains text) to be compatible with version previous than 2.22.1 -->
        <xs:sequence>
            <xs:element name="stackTrace" type="xs:string" minOccurs="0" /> <!-- optional to be compatible with version previous than 2.22.1 -->
            <xs:element name="system-out" type="xs:string" minOccurs="0" />
            <xs:element name="system-err" type="xs:string" minOccurs="0" />
        </xs:sequence>
        <xs:attribute name="message" type="xs:string" />
        <xs:attribute name="type" type="xs:string" use="required" />
    </xs:complexType>

    <xs:element name="failure">
        <xs:complexType mixed="true">
            <xs:attribute name="type" type="xs:string"/>
            <xs:attribute name="message" type="xs:string"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="error">
        <xs:complexType mixed="true">
            <xs:attribute name="type" type="xs:string"/>
            <xs:attribute name="message" type="xs:string"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="skipped">
        <xs:complexType mixed="true">
            <xs:attribute name="type" type="xs:string"/>
            <xs:attribute name="message" type="xs:string"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="properties">
        <xs:complexType>
            <xs:sequence>
                <xs:element ref="property" minOccurs="0" maxOccurs="unbounded"/>
            </xs:sequence>
        </xs:complexType>
    </xs:element>

    <xs:element name="property">
        <xs:complexType>
            <xs:attribute name="name" type="xs:string" use="required"/>
            <xs:attribute name="value" type="xs:string" use="required"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="system-err" type="xs:string"/>
    <xs:element name="system-out" type="xs:string"/>
    <xs:element name="rerunFailure" type="rerunType"/>
    <xs:element name="rerunError" type="rerunType"/>
    <xs:element name="flakyFailure" type="rerunType"/>
    <xs:element name="flakyError" type="rerunType"/>

    <xs:element name="testcase">
        <xs:complexType>
            <xs:sequence>
                <xs:choice minOccurs="0" maxOccurs="unbounded">
                    <xs:element ref="skipped"/>
                    <xs:element ref="error"/>
                    <xs:element ref="failure"/>
                    <xs:element ref="rerunFailure" minOccurs="0" maxOccurs="unbounded"/>
                    <xs:element ref="rerunError" minOccurs="0" maxOccurs="unbounded"/>
                    <xs:element ref="flakyFailure" minOccurs="0" maxOccurs="unbounded"/>
                    <xs:element ref="flakyError" minOccurs="0" maxOccurs="unbounded"/>
                    <xs:element ref="system-out"/>
                    <xs:element ref="system-err"/>
                </xs:choice>
            </xs:sequence>
            <xs:attribute name="name" type="xs:string" use="required"/>
            <xs:attribute name="time" type="xs:string"/>
            <xs:attribute name="classname" type="xs:string"/>
            <xs:attribute name="group" type="xs:string"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="testsuite">
        <xs:complexType>
            <xs:choice minOccurs="0" maxOccurs="unbounded">
                <xs:element ref="testsuite"/>
                <xs:element ref="properties"/>
                <xs:element ref="testcase"/>
                <xs:element ref="system-out"/>
                <xs:element ref="system-err"/>
            </xs:choice>
            <xs:attribute name="name" type="xs:string" use="required"/>
            <xs:attribute name="tests" type="xs:string" use="required"/>
            <xs:attribute name="failures" type="xs:string" use="required"/>
            <xs:attribute name="errors" type="xs:string" use="required"/>
            <xs:attribute name="group" type="xs:string" />
            <xs:attribute name="time" type="SUREFIRE_TIME"/>
            <xs:attribute name="skipped" type="xs:string" />
            <xs:attribute name="timestamp" type="xs:string" />
            <xs:attribute name="hostname" type="xs:string" />
            <xs:attribute name="id" type="xs:string" />
            <xs:attribute name="package" type="xs:string" />
            <xs:attribute name="file" type="xs:string"/>
            <xs:attribute name="log" type="xs:string"/>
            <xs:attribute name="url" type="xs:string"/>
            <xs:attribute name="version" type="xs:string"/>
        </xs:complexType>
    </xs:element>

    <xs:element name="testsuites">
        <xs:complexType>
            <xs:sequence>
                <xs:element ref="testsuite" minOccurs="0" maxOccurs="unbounded" />
            </xs:sequence>
            <xs:attribute name="name" type="xs:string" />
            <xs:attribute name="time" type="SUREFIRE_TIME"/>
            <xs:attribute name="tests" type="xs:string" />
            <xs:attribute name="failures" type="xs:string" />
            <xs:attribute name="errors" type="xs:string" />
        </xs:complexType>
    </xs:element>

</xs:schema>


================================================
FILE: tox.ini
================================================
[pytest]
python_files = *_test.py test*.py
testpaths = tests
norecursedirs = tests/django_example

[tox]
envlist = begin,py{py3,310,311,312,313,314},pytest,py314-django{lts,curr},end,quality

[gh-actions]
python =
    3.10: py310
    3.11: py311
    3.12: py312
    3.13: py313
    3.14: begin,py314,py314-django{tls,curr},end,quality

[testenv]
deps =
    coverage
    codecov>=1.4.0
    coveralls
    djangolts: django~=4.2.0
    djangocurr: django~=5.2.0
    pytest
    lxml>=3.6.0
commands =
    python -m coverage run --append -m pytest
    python -m coverage report --omit='.tox/*'
    python -m xmlrunner discover -p "discovery_test.py"
passenv =
    CI
    TRAVIS
    TRAVIS_*
    CODECOV_TOKEN
    COVERALLS_REPO_TOKEN
    COVERALLS_*
    GITHUB_ACTION
    GITHUB_HEAD_REF
    GITHUB_REF
    GITHUB_REPOSITORY
    GITHUB_RUN_ID
    GITHUB_SHA
    GITHUB_TOKEN

[testenv:uploadcodecov]
commands =
    codecov -e TOXENV

[testenv:uploadcoveralls]
commands =
    -coveralls --service=github

[testenv:finishcoveralls]
commands =
    -coveralls --service=github --finish

[testenv:pytest]
commands = pytest

[testenv:begin]
commands = coverage erase

[testenv:end]
commands =
    coverage report
    coverage html

[testenv:quality]
ignore_outcome = True
deps =
    mccabe
    pylint
    flake8
    pyroma
    pep257
commands =
    pylint xmlrunner tests
    flake8 --max-complexity 10
    pyroma .
    pep257


================================================
FILE: xmlrunner/__init__.py
================================================
# -*- coding: utf-8 -*-

"""
This module provides the XMLTestRunner class, which is heavily based on the
default TextTestRunner.
"""

# Allow version to be detected at runtime.
try:
    from .version import __version__
except ImportError:
    __version__ = "unknown"

from .runner import XMLTestRunner

__all__ = ('__version__', 'XMLTestRunner')


================================================
FILE: xmlrunner/__main__.py
================================================
"""Main entry point"""

import sys
from .runner import XMLTestProgram

if sys.argv[0].endswith("__main__.py"):
    import os.path
    # We change sys.argv[0] to make help message more useful
    # use executable without path, unquoted
    # (it's just a hint anyway)
    # (if you have spaces in your executable you get what you deserve!)
    executable = os.path.basename(sys.executable)
    sys.argv[0] = executable + " -m xmlrunner"
    del os

__unittest = True


XMLTestProgram(module=None)


================================================
FILE: xmlrunner/builder.py
================================================
import re
import sys
import datetime
import time

from xml.dom.minidom import Document


__all__ = ('TestXMLBuilder', 'TestXMLContext')


# see issue #74, the encoding name needs to be one of
# http://www.iana.org/assignments/character-sets/character-sets.xhtml
UTF8 = 'UTF-8'

# Workaround for Python bug #5166
# http://bugs.python.org/issue5166

_char_tail = ''

if sys.maxunicode > 0x10000:
    _char_tail = (u'%s-%s') % (
        chr(0x10000),
        chr(min(sys.maxunicode, 0x10FFFF))
    )

_nontext_sub = re.compile(
    r'[^\x09\x0A\x0D\x20-\uD7FF\uE000-\uFFFD%s]' % _char_tail,
    re.U
).sub


def replace_nontext(text, replacement=u'\uFFFD'):
    return _nontext_sub(replacement, text)


class TestXMLContext(object):
    """A XML report file have a distinct hierarchy. The outermost element is
    'testsuites', which contains one or more 'testsuite' elements. The role of
    these elements is to give the proper context to 'testcase' elements.

    These contexts have a few things in common: they all have some sort of
    counters (i.e. how many testcases are inside that context, how many failed,
    and so on), they all have a 'time' attribute indicating how long it took
    for their testcases to run, etc.

    The purpose of this class is to abstract the job of composing this
    hierarchy while keeping track of counters and how long it took for a
    context to be processed.
    """

    # Allowed keys for self.counters
    _allowed_counters = ('tests', 'errors', 'failures', 'skipped',)

    def __init__(self, xml_doc, parent_context=None):
        """Creates a new instance of a root or nested context (depending whether
        `parent_context` is provided or not).
        """
        self.xml_doc = xml_doc
        self.parent = parent_context
        self._start_time_m = 0
        self._stop_time_m = 0
        self._stop_time = 0
        self.counters = {}

    def element_tag(self):
        """Returns the name of the tag represented by this context.
        """
        return self.element.tagName

    def begin(self, tag, name):
        """Begins the creation of this context in the XML document by creating
        an empty tag <tag name='param'>.
        """
        self.element = self.xml_doc.createElement(tag)
        self.element.setAttribute('name', replace_nontext(name))
        self._start_time = time.monotonic()

    def end(self):
        """Closes this context (started with a call to `begin`) and creates an
        attribute for each counter and another for the elapsed time.
        """
        # time.monotonic is reliable for measuring differences, not affected by NTP
        self._stop_time_m = time.monotonic()
        # time.time is used for reference point
        self._stop_time = time.time()
        self.element.setAttribute('time', self.elapsed_time())
        self.element.setAttribute('timestamp', self.timestamp())
        self._set_result_counters()
        return self.element

    def _set_result_counters(self):
        """Sets an attribute in this context's tag for each counter considering
        what's valid for each tag name.
        """
        tag = self.element_tag()

        for counter_name in TestXMLContext._allowed_counters:
            valid_counter_for_element = False

            if counter_name == 'skipped':
                valid_counter_for_element = (
                    tag == 'testsuite'
                )
            else:
                valid_counter_for_element = (
                    tag in ('testsuites', 'testsuite')
                )

            if valid_counter_for_element:
                value = str(
                    self.counters.get(counter_name, 0)
                )
                self.element.setAttribute(counter_name, value)

    def increment_counter(self, counter_name):
        """Increments a counter named by `counter_name`, which can be any one
        defined in `_allowed_counters`.
        """
        if counter_name in TestXMLContext._allowed_counters:
            self.counters[counter_name] = \
                self.counters.get(counter_name, 0) + 1

    def elapsed_time(self):
        """Returns the time the context took to run between the calls to
        `begin()` and `end()`, in seconds.
        """
        return format(self._stop_time_m - self._start_time_m, '.3f')

    def timestamp(self):
        """Returns the time the context ended as ISO-8601-formatted timestamp.
        """
        return datetime.datetime.fromtimestamp(self._stop_time).replace(microsecond=0).isoformat()


class TestXMLBuilder(object):
    """This class encapsulates most rules needed to create a XML test report
    behind a simple interface.
    """

    def __init__(self):
        """Creates a new instance.
        """
        self._xml_doc = Document()
        self._current_context = None

    def current_context(self):
        """Returns the current context.
        """
        return self._current_context

    def begin_context(self, tag, name):
        """Begins a new context in the XML test report, which usually is defined
        by one on the tags 'testsuites', 'testsuite', or 'testcase'.
        """
        context = TestXMLContext(self._xml_doc, self._current_context)
        context.begin(tag, name)

        self._current_context = context

    def context_tag(self):
        """Returns the tag represented by the current context.
        """
        return self._current_context.element_tag()

    def _create_cdata_section(self, content):
        """Returns a new CDATA section containing the string defined in
        `content`.
        """
        filtered_content = replace_nontext(content)
        return self._xml_doc.createCDATASection(filtered_content)

    def append_cdata_section(self, tag, content):
        """Appends a tag in the format <tag>CDATA</tag> into the tag represented
        by the current context. Returns the created tag.
        """
        element = self._xml_doc.createElement(tag)

        pos = content.find(']]>')
        while pos >= 0:
            tmp = content[0:pos+2]
            element.appendChild(self._create_cdata_section(tmp))
            content = content[pos+2:]
            pos = content.find(']]>')

        element.appendChild(self._create_cdata_section(content))

        self._append_child(element)
        return element

    def append(self, tag, content, **kwargs):
        """Apends a tag in the format <tag attr='val' attr2='val2'>CDATA</tag>
        into the tag represented by the current context. Returns the created
        tag.
        """
        element = self._xml_doc.createElement(tag)

        for key, value in kwargs.items():
            filtered_value = replace_nontext(str(value))
            element.setAttribute(key, filtered_value)

        if content:
            element.appendChild(self._create_cdata_section(content))

        self._append_child(element)
        return element

    def _append_child(self, element):
        """Appends a tag object represented by `element` into the tag
        represented by the current context.
        """
        if self._current_context:
            self._current_context.element.appendChild(element)
        else:
            self._xml_doc.appendChild(element)

    def increment_counter(self, counter_name):
        """Increments a counter in the current context and their parents.
        """
        context = self._current_context

        while context:
            context.increment_counter(counter_name)
            context = context.parent

    def end_context(self):
        """Ends the current context and sets the current context as being the
        previous one (if it exists). Also, when a context ends, its tag is
        appended in the proper place inside the document.
        """
        if not self._current_context:
            return False

        element = self._current_context.end()

        self._current_context = self._current_context.parent
        self._append_child(element)

        return True

    def finish(self):
        """Ends all open contexts and returns a pretty printed version of the
        generated XML document.
        """
        while self.end_context():
            pass
        return self._xml_doc.toprettyxml(indent='\t', encoding=UTF8)


================================================
FILE: xmlrunner/extra/__init__.py
================================================


================================================
FILE: xmlrunner/extra/djangotestrunner.py
================================================
# -*- coding: utf-8 -*-

"""
Custom Django test runner that runs the tests using the
XMLTestRunner class.

This script shows how to use the XMLTestRunner in a Django project. To learn
how to configure a custom TestRunner in a Django project, please read the
Django docs website.
"""

import os
import xmlrunner
import os.path
from django.conf import settings
from django.test.runner import DiscoverRunner


class XMLTestRunner(DiscoverRunner):
    test_runner = xmlrunner.XMLTestRunner

    def get_resultclass(self):
        # Django provides `DebugSQLTextTestResult` if `debug_sql` argument is True
        # To use `xmlrunner.result._XMLTestResult` we supress default behavior
        return None

    def get_test_runner_kwargs(self):
        # We use separate verbosity setting for our runner
        verbosity = getattr(settings, 'TEST_OUTPUT_VERBOSE', 1)
        if isinstance(verbosity, bool):
            verbosity = (1, 2)[verbosity]
        verbosity = verbosity  # not self.verbosity

        output_dir = getattr(settings, 'TEST_OUTPUT_DIR', '.')
        single_file = getattr(settings, 'TEST_OUTPUT_FILE_NAME', None)

        # For single file case we are able to create file here
        # But for multiple files case files will be created inside runner/results
        if single_file is None:  # output will be a path (folder)
            output = output_dir
        else:  # output will be a stream
            if not os.path.exists(output_dir):
                os.makedirs(output_dir)
            file_path = os.path.join(output_dir, single_file)
            output = open(file_path, 'wb')

        return dict(
            verbosity=verbosity,
            descriptions=getattr(settings, 'TEST_OUTPUT_DESCRIPTIONS', False),
            failfast=self.failfast,
            resultclass=self.get_resultclass(),
            output=output,
        )

    def run_suite(self, suite, **kwargs):
        runner_kwargs = self.get_test_runner_kwargs()
        runner = self.test_runner(**runner_kwargs)
        results = runner.run(suite)
        if hasattr(runner_kwargs['output'], 'close'):
            runner_kwargs['output'].close()
        return results


================================================
FILE: xmlrunner/extra/xunit_plugin.py
================================================
import io
import lxml.etree as etree


TRANSFORM = etree.XSLT(etree.XML(b'''\
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="2.0">
    <xsl:output method="xml" indent="yes" />

    <!-- /dev/null for these attributes -->
    <xsl:template match="//testcase/@file" />
    <xsl:template match="//testcase/@line" />
    <xsl:template match="//testcase/@timestamp" />

    <!-- copy the rest -->
    <xsl:template match="node()|@*">
        <xsl:copy>
            <xsl:apply-templates select="node()|@*" />
        </xsl:copy>
    </xsl:template>
</xsl:stylesheet>'''))


def transform(xml_data):
    out = io.BytesIO()
    xml_doc = etree.XML(xml_data)
    result = TRANSFORM(xml_doc)
    result.write(out)
    return out.getvalue()


================================================
FILE: xmlrunner/result.py
================================================

import inspect
import io
import os
import sys
import datetime
import traceback
import re
from os import path
from io import StringIO

# use direct import to bypass freezegun
from time import time

from .unittest import TestResult, TextTestResult, failfast


# Matches invalid XML1.0 unicode characters, like control characters:
# http://www.w3.org/TR/2006/REC-xml-20060816/#charsets
# http://stackoverflow.com/questions/1707890/fast-way-to-filter-illegal-xml-unicode-chars-in-python

_illegal_unichrs = [
    (0x00, 0x08), (0x0B, 0x0C), (0x0E, 0x1F),
    (0x7F, 0x84), (0x86, 0x9F),
    (0xFDD0, 0xFDDF), (0xFFFE, 0xFFFF),
]
if sys.maxunicode >= 0x10000:  # not narrow build
    _illegal_unichrs.extend([
        (0x1FFFE, 0x1FFFF), (0x2FFFE, 0x2FFFF),
        (0x3FFFE, 0x3FFFF), (0x4FFFE, 0x4FFFF),
        (0x5FFFE, 0x5FFFF), (0x6FFFE, 0x6FFFF),
        (0x7FFFE, 0x7FFFF), (0x8FFFE, 0x8FFFF),
        (0x9FFFE, 0x9FFFF), (0xAFFFE, 0xAFFFF),
        (0xBFFFE, 0xBFFFF), (0xCFFFE, 0xCFFFF),
        (0xDFFFE, 0xDFFFF), (0xEFFFE, 0xEFFFF),
        (0xFFFFE, 0xFFFFF), (0x10FFFE, 0x10FFFF),
    ])

_illegal_ranges = [
    "%s-%s" % (chr(low), chr(high))
    for (low, high) in _illegal_unichrs
]

INVALID_XML_1_0_UNICODE_RE = re.compile(u'[%s]' % u''.join(_illegal_ranges))


STDOUT_LINE = '\nStdout:\n%s'
STDERR_LINE = '\nStderr:\n%s'


def safe_unicode(data, encoding='utf8'):
    """Return a unicode string containing only valid XML characters.

    encoding - if data is a byte string it is first decoded to unicode
        using this encoding.
    """
    data = str(data)
    return INVALID_XML_1_0_UNICODE_RE.sub('', data)


def testcase_name(test_method):
    testcase = type(test_method)

    # Ignore module name if it is '__main__'
    module = testcase.__module__ + '.'
    if module == '__main__.':
        module = ''
    result = module + testcase.__name__
    return result


def resolve_filename(filename):
    # Try to make filename relative to current directory.
    try:
        rel_filename = os.path.relpath(filename)
    except ValueError:
        return filename
    # if not inside folder, keep as-is
    return filename if rel_filename.startswith('../') else rel_filename


class _DuplicateWriter(io.TextIOBase):
    """
    Duplicate output from the first handle to the second handle

    The second handle is expected to be a StringIO and not to block.
    """

    def __init__(self, first, second):
        super(_DuplicateWriter, self).__init__()
        self._first = first
        self._second = second

    def flush(self):
        try:
            self._first.flush()
        except ValueError:
            pass
        try:
            self._second.flush()
        except ValueError:
            pass

    def writable(self):
        return True

    def getvalue(self):
        return self._second.getvalue()

    def writelines(self, lines):
        self._first.writelines(lines)
        self._second.writelines(lines)

    def write(self, b):
        if isinstance(self._first, io.TextIOBase):
            wrote = self._first.write(b)

            if wrote is not None:
                # expected to always succeed to write
                self._second.write(b[:wrote])

            return wrote
        else:
            # file-like object that doesn't return wrote bytes.
            self._first.write(b)
            self._second.write(b)
            return len(b)


class _TestInfo(object):
    """
    This class keeps useful information about the execution of a
    test method.
    """

    # Possible test outcomes
    (SUCCESS, FAILURE, ERROR, SKIP) = range(4)

    OUTCOME_ELEMENTS = {
        SUCCESS: None,
        FAILURE: 'failure',
        ERROR: 'error',
        SKIP: 'skipped',
    }

    def __init__(self, test_result, test_method, outcome=SUCCESS, err=None, subTest=None, filename=None, lineno=None, doc=None):
        self.test_result = test_result
        self.outcome = outcome
        self.elapsed_time = 0
        self.timestamp = datetime.datetime.min.replace(microsecond=0).isoformat()
        if err is not None:
            if self.outcome != _TestInfo.SKIP:
                self.test_exception_name = safe_unicode(err[0].__name__)
                self.test_exception_message = safe_unicode(err[1])
            else:
                self.test_exception_message = safe_unicode(err)

        self.stdout = test_result._stdout_data
        self.stderr = test_result._stderr_data

        self.test_description = self.test_result.getDescription(test_method)
        self.test_exception_info = (
            '' if outcome in (self.SUCCESS, self.SKIP)
            else self.test_result._exc_info_to_string(
                    err, test_method)
        )

        self.test_name = testcase_name(test_method)
        self.test_id = test_method.id()

        if subTest:
            self.test_id = subTest.id()
            self.test_description = self.test_result.getDescription(subTest)

        self.filename = filename
        self.lineno = lineno
        self.doc = doc

    def id(self):
        return self.test_id

    def test_finished(self):
        """Save info that can only be calculated once a test has run.
        """
        self.elapsed_time = \
            self.test_result.stop_time - self.test_result.start_time
        timestamp = datetime.datetime.fromtimestamp(self.test_result.stop_time)
        self.timestamp = timestamp.replace(microsecond=0).isoformat()

    def get_error_info(self):
        """
        Return a text representation of an exception thrown by a test
        method.
        """
        return self.test_exception_info

    def shortDescription(self):
        return self.test_description


class _XMLTestResult(TextTestResult):
    """
    A test result class that can express test results in a XML report.

    Used by XMLTestRunner.
    """
    def __init__(self, stream=sys.stderr, descriptions=1, verbosity=1,
                 elapsed_times=True, properties=None, infoclass=None):
        TextTestResult.__init__(self, stream, descriptions, verbosity)
        self._stdout_data = None
        self._stderr_data = None
        self._stdout_capture = StringIO()
        self.__stdout_saved = None
        self._stderr_capture = StringIO()
        self.__stderr_saved = None
        self.successes = []
        self.callback = None
        self.elapsed_times = elapsed_times
        self.properties = properties  # junit testsuite properties
        self.filename = None
        self.lineno = None
        self.doc = None
        if infoclass is None:
            self.infoclass = _TestInfo
        else:
            self.infoclass = infoclass

    def _prepare_callback(self, test_info, target_list, verbose_str,
                          short_str):
        """
        Appends a `infoclass` to the given target list and sets a callback
        method to be called by stopTest method.
        """
        test_info.filename = self.filename
        test_info.lineno = self.lineno
        test_info.doc = self.doc
        target_list.append(test_info)

        def callback():
            """Prints the test method outcome to the stream, as well as
            the elapsed time.
            """

            test_info.test_finished()

            # Ignore the elapsed times for a more reliable unit testing
            if not self.elapsed_times:
                self.start_time = self.stop_time = 0

            if self.showAll:
                self.stream.writeln(
                    '%s (%.3fs)' % (verbose_str, test_info.elapsed_time)
                )
            elif self.dots:
                self.stream.write(short_str)

            self.stream.flush()

        self.callback = callback

    def startTest(self, test):
        """
        Called before execute each test method.
        """
        self.start_time = time()
        TestResult.startTest(self, test)

        try:
            if getattr(test, '_dt_test', None) is not None:
                # doctest.DocTestCase
                self.filename = test._dt_test.filename
                self.lineno = test._dt_test.lineno
            else:
                # regular unittest.TestCase?
                test_method = getattr(test, test._testMethodName)
                test_class = type(test)
                # Note: inspect can get confused with decorators, so use class.
                self.filename = inspect.getsourcefile(test_class)
                # Handle partial and partialmethod objects.
                test_method = getattr(test_method, 'func', test_method)
                _, self.lineno = inspect.getsourcelines(test_method)

                self.doc = test_method.__doc__
        except (AttributeError, IOError, TypeError):
            # issue #188, #189, #195
            # some frameworks can make test method opaque.
            pass

        if self.showAll:
            self.stream.write('  ' + self.getDescription(test))
            self.stream.write(" ... ")
            self.stream.flush()

    def _setupStdout(self):
        """
        Capture stdout / stderr by replacing sys.stdout / sys.stderr
        """
        super(_XMLTestResult, self)._setupStdout()
        self.__stdout_saved = sys.stdout
        sys.stdout = _DuplicateWriter(sys.stdout, self._stdout_capture)
        self.__stderr_saved = sys.stderr
        sys.stderr = _DuplicateWriter(sys.stderr, self._stderr_capture)

    def _restoreStdout(self):
        """
        Stop capturing stdout / stderr and recover sys.stdout / sys.stderr
        """
        if self.__stdout_saved:
            sys.stdout = self.__stdout_saved
            self.__stdout_saved = None
        if self.__stderr_saved:
            sys.stderr = self.__stderr_saved
            self.__stderr_saved = None
        self._stdout_capture.seek(0)
        self._stdout_capture.truncate()
        self._stderr_capture.seek(0)
        self._stderr_capture.truncate()
        super(_XMLTestResult, self)._restoreStdout()

    def _save_output_data(self):
        self._stdout_data = self._stdout_capture.getvalue()
        self._stderr_data = self._stderr_capture.getvalue()

    def stopTest(self, test):
        """
        Called after execute each test method.
        """
        self._save_output_data()
        # self._stdout_data = sys.stdout.getvalue()
        # self._stderr_data = sys.stderr.getvalue()

        TextTestResult.stopTest(self, test)
        self.stop_time = time()

        if self.callback and callable(self.callback):
            self.callback()
            self.callback = None

    def addSuccess(self, test):
        """
        Called when a test executes successfully.
        """
        self._save_output_data()
        self._prepare_callback(
            self.infoclass(self, test), self.successes, 'ok', '.'
        )

    @failfast
    def addFailure(self, test, err):
        """
        Called when a test method fails.
        """
        self._save_output_data()
        testinfo = self.infoclass(
            self, test, self.infoclass.FAILURE, err)
        self.failures.append((
            testinfo,
            self._exc_info_to_string(err, test)
        ))
        self._prepare_callback(testinfo, [], 'FAIL', 'F')

    @failfast
    def addError(self, test, err):
        """
        Called when a test method raises an error.
        """
        self._save_output_data()
        testinfo = self.infoclass(
            self, test, self.infoclass.ERROR, err)
        self.errors.append((
            testinfo,
            self._exc_info_to_string(err, test)
        ))
        self._prepare_callback(testinfo, [], 'ERROR', 'E')

    def addSubTest(self, testcase, test, err):
        """
        Called when a subTest method raises an error.
        """
        if err is not None:

            errorText = None
            errorValue = None
            errorList = None
            if issubclass(err[0], test.failureException):
                errorText = 'FAIL'
                errorValue = self.infoclass.FAILURE
                errorList = self.failures

            else:
                errorText = 'ERROR'
                errorValue = self.infoclass.ERROR
                errorList = self.errors

            self._save_output_data()

            testinfo = self.infoclass(
                self, testcase, errorValue, err, subTest=test)
            errorList.append((
                testinfo,
                self._exc_info_to_string(err, testcase)
            ))
            self._prepare_callback(testinfo, [], errorText, errorText[0])

    def addSkip(self, test, reason):
        """
        Called when a test method was skipped.
        """
        self._save_output_data()
        testinfo = self.infoclass(
            self, test, self.infoclass.SKIP, reason)
        testinfo.test_exception_name = 'skip'
        testinfo.test_exception_message = reason
        self.skipped.append((testinfo, reason))
        self._prepare_callback(testinfo, [], 'skip', 's')

    def addExpectedFailure(self, test, err):
        """
        Missing in xmlrunner, copy-pasted from xmlrunner addError.
        """
        self._save_output_data()

        testinfo = self.infoclass(self, test, self.infoclass.SKIP, err)
        testinfo.test_exception_name = 'XFAIL'
        testinfo.test_exception_message = 'expected failure: {}'.format(testinfo.test_exception_message)

        self.expectedFailures.append((testinfo, self._exc_info_to_string(err, test)))
        self._prepare_callback(testinfo, [], 'expected failure', 'x')

    @failfast
    def addUnexpectedSuccess(self, test):
        """
        Missing in xmlrunner, copy-pasted from xmlrunner addSuccess.
        """
        self._save_output_data()

        testinfo = self.infoclass(self, test)  # do not set outcome here because it will need exception
        testinfo.outcome = self.infoclass.ERROR
        # But since we want to have error outcome, we need to provide additional fields:
        testinfo.test_exception_name = 'UnexpectedSuccess'
        testinfo.test_exception_message = ('Unexpected success: This test was marked as expected failure but passed, '
                                           'please review it')

        self.unexpectedSuccesses.append((testinfo, 'unexpected success'))
        self._prepare_callback(testinfo, [], 'unexpected success', 'u')

    def printErrorList(self, flavour, errors):
        """
        Writes information about the FAIL or ERROR to the stream.
        """
        for test_info, dummy in errors:
            self.stream.writeln(self.separator1)
            self.stream.writeln(
                '%s [%.3fs]: %s' % (flavour, test_info.elapsed_time,
                                    test_info.test_description)
            )
            self.stream.writeln(self.separator2)
            self.stream.writeln('%s' % test_info.get_error_info())
            self.stream.flush()

    def _get_info_by_testcase(self):
        """
        Organizes test results by TestCase module. This information is
        used during the report generation, where a XML report will be created
        for each TestCase.
        """
        tests_by_testcase = {}

        for tests in (self.successes, self.failures, self.errors,
                      self.skipped, self.expectedFailures, self.unexpectedSuccesses):
            for test_info in tests:
                if isinstance(test_info, tuple):
                    # This is a skipped, error or a failure test case
                    test_info = test_info[0]
                testcase_name = test_info.test_name
                if testcase_name not in tests_by_testcase:
                    tests_by_testcase[testcase_name] = []
                tests_by_testcase[testcase_name].append(test_info)

        return tests_by_testcase

    def _report_testsuite_properties(xml_testsuite, xml_document, properties):
        if properties:
            xml_properties = xml_document.createElement('properties')
            xml_testsuite.appendChild(xml_properties)
            for key, value in properties.items():
                prop = xml_document.createElement('property')
                prop.setAttribute('name', str(key))
                prop.setAttribute('value', str(value))
                xml_properties.appendChild(prop)

    _report_testsuite_properties = staticmethod(_report_testsuite_properties)

    def _report_testsuite(suite_name, tests, xml_document, parentElement,
                          properties):
        """
        Appends the testsuite section to the XML document.
        """
        testsuite = xml_document.createElement('testsuite')
        parentElement.appendChild(testsuite)
        module_name = suite_name.rpartition('.')[0]
        file_name = module_name.replace('.', '/') + '.py'

        testsuite.setAttribute('name', suite_name)
        testsuite.setAttribute('tests', str(len(tests)))
        testsuite.setAttribute('file', file_name)

        testsuite.setAttribute(
            'time', '%.3f' % sum(map(lambda e: e.elapsed_time, tests))
        )
        if tests:
            testsuite.setAttribute(
                'timestamp', max(map(lambda e: e.timestamp, tests))
            )
        failures = filter(lambda e: e.outcome == e.FAILURE, tests)
        testsuite.setAttribute('failures', str(len(list(failures))))

        errors = filter(lambda e: e.outcome == e.ERROR, tests)
        testsuite.setAttribute('errors', str(len(list(errors))))

        skips = filter(lambda e: e.outcome == _TestInfo.SKIP, tests)
        testsuite.setAttribute('skipped', str(len(list(skips))))

        _XMLTestResult._report_testsuite_properties(
            testsuite, xml_document, properties)

        for test in tests:
            _XMLTestResult._report_testcase(test, testsuite, xml_document)

        return testsuite

    _report_testsuite = staticmethod(_report_testsuite)

    def _test_method_name(test_id):
        """
        Returns the test method name.
        """
        # Trick subtest referencing objects
        subtest_parts = test_id.split(' ')
        test_method_name = subtest_parts[0].split('.')[-1]
        subtest_method_name = [test_method_name] + subtest_parts[1:]
        return ' '.join(subtest_method_name)

    _test_method_name = staticmethod(_test_method_name)

    def _createCDATAsections(xmldoc, node, text):
        text = safe_unicode(text)
        pos = text.find(']]>')
        while pos >= 0:
            tmp = text[0:pos+2]
            cdata = xmldoc.createCDATASection(tmp)
            node.appendChild(cdata)
            text = text[pos+2:]
            pos = text.find(']]>')
        cdata = xmldoc.createCDATASection(text)
        node.appendChild(cdata)

    _createCDATAsections = staticmethod(_createCDATAsections)

    def _report_testcase(test_result, xml_testsuite, xml_document):
        """
        Appends a testcase section to the XML document.
        """
        testcase = xml_document.createElement('testcase')
        xml_testsuite.appendChild(testcase)

        class_name = re.sub(r'^__main__.', '', test_result.id())

        # Trick subtest referencing objects
        class_name = class_name.split(' ')[0].rpartition('.')[0]

        testcase.setAttribute('classname', class_name)
        testcase.setAttribute(
            'name', _XMLTestResult._test_method_name(test_result.test_id)
        )
        testcase.setAttribute('time', '%.3f' % test_result.elapsed_time)
        testcase.setAttribute('timestamp', test_result.timestamp)

        if test_result.filename is not None:
            # Try to make filename relative to current directory.
            filename = resolve_filename(test_result.filename)
            testcase.setAttribute('file', filename)

        if test_result.lineno is not None:
            testcase.setAttribute('line', str(test_result.lineno))

        if test_result.doc is not None:
            comment = str(test_result.doc)
            # The use of '--' is forbidden in XML comments
            comment = comment.replace('--', '&#45;&#45;')
            testcase.appendChild(xml_document.createComment(safe_unicode(comment)))

        result_elem_name = test_result.OUTCOME_ELEMENTS[test_result.outcome]

        if result_elem_name is not None:
            result_elem = xml_document.createElement(result_elem_name)
            testcase.appendChild(result_elem)

            result_elem.setAttribute(
                'type',
                test_result.test_exception_name
            )
            result_elem.setAttribute(
                'message',
                test_result.test_exception_message
            )
            if test_result.get_error_info():
                error_info = safe_unicode(test_result.get_error_info())
                _XMLTestResult._createCDATAsections(
                    xml_document, result_elem, error_info)

        if test_result.stdout:
            systemout = xml_document.createElement('system-out')
            testcase.appendChild(systemout)
            _XMLTestResult._createCDATAsections(
                xml_document, systemout, test_result.stdout)

        if test_result.stderr:
            systemout = xml_document.createElement('system-err')
            testcase.appendChild(systemout)
            _XMLTestResult._createCDATAsections(
                xml_document, systemout, test_result.stderr)

    _report_testcase = staticmethod(_report_testcase)

    def generate_reports(self, test_runner):
        """
        Generates the XML reports to a given XMLTestRunner object.
        """
        from xml.dom.minidom import Document
        all_results = self._get_info_by_testcase()

        outputHandledAsString = \
            isinstance(test_runner.output, str)

        if (outputHandledAsString and not os.path.exists(test_runner.output)):
            os.makedirs(test_runner.output)

        if not outputHandledAsString:
            doc = Document()
            testsuite = doc.createElement('testsuites')
            doc.appendChild(testsuite)
            parentElement = testsuite

        for suite, tests in all_results.items():
            if outputHandledAsString:
                doc = Document()
                parentElement = doc

            suite_name = suite
            if test_runner.outsuffix:
                # not checking with 'is not None', empty means no suffix.
                suite_name = '%s-%s' % (suite, test_runner.outsuffix)

            # Build the XML file
            testsuite = _XMLTestResult._report_testsuite(
                suite_name, tests, doc, parentElement, self.properties
            )

            if outputHandledAsString:
                xml_content = doc.toprettyxml(
                    indent='\t',
                    encoding=test_runner.encoding
                )
                filename = path.join(
                    test_runner.output,
                    'TEST-%s.xml' % suite_name)
                with open(filename, 'wb') as report_file:
                    report_file.write(xml_content)

                if self.showAll:
                    self.stream.writeln('Generated XML report: {}'.format(filename))

        if not outputHandledAsString:
            # Assume that test_runner.output is a stream
            xml_content = doc.toprettyxml(
                indent='\t',
                encoding=test_runner.encoding
            )
            test_runner.output.write(xml_content)

    def _exc_info_to_string(self, err, test):
        """Converts a sys.exc_info()-style tuple of values into a string."""
        return super(_XMLTestResult, self)._exc_info_to_string(err, test)

    def getDescription(self, test):
        if isinstance(test, tuple):
            test = test[0]
        return super().getDescription(test)


================================================
FILE: xmlrunner/runner.py
================================================

import argparse
import sys
import time

from .unittest import TextTestRunner, TestProgram
from .result import _XMLTestResult

# see issue #74, the encoding name needs to be one of
# http://www.iana.org/assignments/character-sets/character-sets.xhtml
UTF8 = 'UTF-8'


class XMLTestRunner(TextTestRunner):
    """
    A test runner class that outputs the results in JUnit like XML files.
    """

    def __init__(self, output='.', outsuffix=None,
                 elapsed_times=True, encoding=UTF8,
                 resultclass=None,
                 **kwargs):
        super(XMLTestRunner, self).__init__(**kwargs)
        self.output = output
        self.encoding = encoding
        # None means default timestamped suffix
        # '' (empty) means no suffix
        if outsuffix is None:
            outsuffix = time.strftime("%Y%m%d%H%M%S")
        self.outsuffix = outsuffix
        self.elapsed_times = elapsed_times
        if resultclass is None:
            self.resultclass = _XMLTestResult
        else:
            self.resultclass = resultclass

    def _make_result(self):
        """
        Creates a TestResult object which will be used to store
        information about the executed tests.
        """
        # override in subclasses if necessary.
        return self.resultclass(
            self.stream, self.descriptions, self.verbosity, self.elapsed_times
        )

    def run(self, test):
        """
        Runs the given test case or test suite.
        """
        try:
            # Prepare the test execution
            result = self._make_result()
            result.failfast = self.failfast
            result.buffer = self.buffer
            if hasattr(test, 'properties'):
                # junit testsuite properties
                result.properties = test.properties

            # Print a nice header
            self.stream.writeln()
            self.stream.writeln('Running tests...')
            self.stream.writeln(result.separator2)

            # Execute tests
            start_time = time.monotonic()
            test(result)
            stop_time = time.monotonic()
            time_taken = stop_time - start_time

            # Print results
            result.printErrors()
            self.stream.writeln(result.separator2)
            run = result.testsRun
            self.stream.writeln("Ran %d test%s in %.3fs" % (
                run, run != 1 and "s" or "", time_taken)
            )
            self.stream.writeln()

            # other metrics
            expectedFails = len(result.expectedFailures)
            unexpectedSuccesses = len(result.unexpectedSuccesses)
            skipped = len(result.skipped)

            # Error traces
            infos = []
            if not result.wasSuccessful():
                self.stream.write("FAILED")
                failed, errored = map(len, (result.failures, result.errors))
                if failed:
                    infos.append("failures={0}".format(failed))
                if errored:
                    infos.append("errors={0}".format(errored))
            else:
                self.stream.write("OK")

            if skipped:
                infos.append("skipped={0}".format(skipped))
            if expectedFails:
                infos.append("expected failures={0}".format(expectedFails))
            if unexpectedSuccesses:
                infos.append("unexpected successes={0}".format(
                    unexpectedSuccesses))

            if infos:
                self.stream.writeln(" ({0})".format(", ".join(infos)))
            else:
                self.stream.write("\n")

            # Generate reports
            self.stream.writeln()
            self.stream.writeln('Generating XML reports...')
            result.generate_reports(self)
        finally:
            pass

        return result


class XMLTestProgram(TestProgram):

    def __init__(self, *args, **kwargs):
        kwargs.setdefault('testRunner', XMLTestRunner)
        self.warnings = None  # python2 fix
        self._parseKnownArgs(kwargs)
        super(XMLTestProgram, self).__init__(*args, **kwargs)

    def _parseKnownArgs(self, kwargs):
        argv = kwargs.get('argv')
        if argv is None:
            argv = sys.argv

        # python2 argparse fix
        parser = argparse.ArgumentParser(prog='xmlrunner')
        group = parser.add_mutually_exclusive_group()
        group.add_argument(
            '-o', '--output', metavar='DIR',
            help='Directory for storing XML reports (\'.\' default)')
        group.add_argument(
            '--output-file', metavar='FILENAME',
            help='Filename for storing XML report')
        parser.add_argument(
            '--outsuffix', metavar='STRING',
            help='Output suffix (timestamp is default)')
        namespace, argv = parser.parse_known_args(argv)
        self.output = namespace.output
        self.output_file = namespace.output_file
        self.outsuffix = namespace.outsuffix
        kwargs['argv'] = argv

    def _initArgParsers(self):
        # this code path is only called in python3 (optparse vs argparse)
        super(XMLTestProgram, self)._initArgParsers()

        for parser in (self._main_parser, self._discovery_parser):
            group = parser.add_mutually_exclusive_group()
            group.add_argument(
                '-o', '--output', metavar='DIR', nargs=1,
                help='Directory for storing XML reports (\'.\' default)')
            group.add_argument(
                '--output-file', metavar='FILENAME', nargs=1,
                help='Filename for storing XML report')
            group.add_argument(
                '--outsuffix', metavar='STRING', nargs=1,
                help='Output suffix (timestamp is default)')

    def runTests(self):
        kwargs = dict(
            verbosity=self.verbosity,
            failfast=self.failfast,
            buffer=self.buffer,
            warnings=self.warnings,
        )
        if sys.version_info[:2] > (3, 4):
            kwargs.update(tb_locals=self.tb_locals)

        output_file = None
        try:
            if self.output_file is not None:
                output_file = open(self.output_file, 'wb')
                kwargs.update(output=output_file)
            elif self.output is not None:
                kwargs.update(output=self.output)

            if self.outsuffix is not None:
                kwargs.update(outsuffix=self.outsuffix)

            self.testRunner = self.testRunner(**kwargs)
            super(XMLTestProgram, self).runTests()
        finally:
            if output_file is not None:
                output_file.close()


================================================
FILE: xmlrunner/unittest.py
================================================

from __future__ import absolute_import

import sys
# pylint: disable-msg=W0611
import unittest
from unittest import TextTestRunner
from unittest import TestResult, TextTestResult
from unittest.result import failfast
from unittest.main import TestProgram


__all__ = (
    'unittest', 'TextTestRunner', 'TestResult', 'TextTestResult',
    'TestProgram', 'failfast')
Download .txt
gitextract_u1gfj0s8/

├── .coveragerc
├── .github/
│   ├── ISSUE_TEMPLATE/
│   │   ├── bug_report.md
│   │   └── feature_request.md
│   ├── pull_request_template.md
│   └── workflows/
│       └── tests.yml
├── .gitignore
├── .landscape.yml
├── LICENSE
├── MANIFEST.in
├── Makefile
├── README.md
├── docs/
│   ├── Makefile
│   ├── conf.py
│   └── index.rst
├── pyproject.toml
├── tests/
│   ├── __init__.py
│   ├── builder_test.py
│   ├── discovery_test.py
│   ├── django_example/
│   │   ├── app/
│   │   │   ├── __init__.py
│   │   │   ├── admin.py
│   │   │   ├── migrations/
│   │   │   │   └── __init__.py
│   │   │   ├── models.py
│   │   │   ├── tests.py
│   │   │   └── views.py
│   │   ├── app2/
│   │   │   ├── __init__.py
│   │   │   ├── admin.py
│   │   │   ├── migrations/
│   │   │   │   └── __init__.py
│   │   │   ├── models.py
│   │   │   ├── tests.py
│   │   │   └── views.py
│   │   ├── example/
│   │   │   ├── __init__.py
│   │   │   ├── settings.py
│   │   │   ├── urls.py
│   │   │   └── wsgi.py
│   │   └── manage.py
│   ├── django_test.py
│   ├── doctest_example.py
│   ├── testsuite.py
│   └── vendor/
│       └── jenkins/
│           └── xunit-plugin/
│               ├── 14c6e39c38408b9ed6280361484a13c6f5becca7/
│               │   └── junit-10.xsd
│               └── ae25da5089d4f94ac6c4669bf736e4d416cc4665/
│                   └── junit-10.xsd
├── tox.ini
└── xmlrunner/
    ├── __init__.py
    ├── __main__.py
    ├── builder.py
    ├── extra/
    │   ├── __init__.py
    │   ├── djangotestrunner.py
    │   └── xunit_plugin.py
    ├── result.py
    ├── runner.py
    └── unittest.py
Download .txt
SYMBOL INDEX (236 symbols across 12 files)

FILE: tests/builder_test.py
  class TestXMLContextTest (line 11) | class TestXMLContextTest(unittest.TestCase):
    method setUp (line 15) | def setUp(self):
    method test_current_element_tag_name (line 19) | def test_current_element_tag_name(self):
    method test_current_context_name (line 23) | def test_current_context_name(self):
    method test_current_context_invalid_unicode_name (line 28) | def test_current_context_invalid_unicode_name(self):
    method test_increment_valid_testsuites_counters (line 33) | def test_increment_valid_testsuites_counters(self):
    method test_increment_valid_testsuite_counters (line 48) | def test_increment_valid_testsuite_counters(self):
    method test_increment_counters_for_unknown_context (line 60) | def test_increment_counters_for_unknown_context(self):
    method test_empty_counters_on_end_context (line 72) | def test_empty_counters_on_end_context(self):
    method test_add_time_attribute_on_end_context (line 79) | def test_add_time_attribute_on_end_context(self):
    method test_add_timestamp_attribute_on_end_context (line 85) | def test_add_timestamp_attribute_on_end_context(self):
  class TestXMLBuilderTest (line 92) | class TestXMLBuilderTest(unittest.TestCase):
    method setUp (line 96) | def setUp(self):
    method test_root_has_no_parent (line 106) | def test_root_has_no_parent(self):
    method test_current_context_tag (line 109) | def test_current_context_tag(self):
    method test_begin_nested_context (line 112) | def test_begin_nested_context(self):
    method test_end_inexistent_context (line 120) | def test_end_inexistent_context(self):
    method test_end_root_context (line 126) | def test_end_root_context(self):
    method test_end_nested_context (line 141) | def test_end_nested_context(self):
    method test_end_all_context_stack (line 150) | def test_end_all_context_stack(self):
    method test_append_valid_unicode_cdata_section (line 172) | def test_append_valid_unicode_cdata_section(self):
    method test_append_invalid_unicode_cdata_section (line 184) | def test_append_invalid_unicode_cdata_section(self):
    method test_append_cdata_closing_tags_into_cdata_section (line 194) | def test_append_cdata_closing_tags_into_cdata_section(self):
    method test_append_tag_with_valid_unicode_values (line 203) | def test_append_tag_with_valid_unicode_values(self):
    method test_append_tag_with_invalid_unicode_values (line 214) | def test_append_tag_with_invalid_unicode_values(self):
    method test_increment_root_context_counter (line 225) | def test_increment_root_context_counter(self):
    method test_increment_nested_context_counter (line 234) | def test_increment_nested_context_counter(self):
    method test_finish_nested_context (line 250) | def test_finish_nested_context(self):

FILE: tests/discovery_test.py
  class DiscoveryTest (line 4) | class DiscoveryTest(unittest.TestCase):
    method test_discovery_pass (line 5) | def test_discovery_pass(self):

FILE: tests/django_example/app/tests.py
  class DummyTestCase (line 5) | class DummyTestCase(TestCase):
    method test_pass (line 6) | def test_pass(self):
    method test_negative_comment1 (line 10) | def test_negative_comment1(self):
    method test_negative_comment2 (line 14) | def test_negative_comment2(self):

FILE: tests/django_example/app2/tests.py
  class DummyTestCase (line 5) | class DummyTestCase(TestCase):
    method test_pass (line 6) | def test_pass(self):

FILE: tests/django_test.py
  class DjangoTest (line 26) | class DjangoTest(unittest.TestCase):
    method setUp (line 28) | def setUp(self):
    method tearDown (line 45) | def tearDown(self):
    method _override_settings (line 50) | def _override_settings(self, **kwargs):
    method _check_runner (line 55) | def _check_runner(self, runner):
    method test_django_runner (line 73) | def test_django_runner(self):
    method test_django_xmlrunner (line 78) | def test_django_xmlrunner(self):
    method test_django_verbose (line 85) | def test_django_verbose(self):
    method test_django_single_report (line 93) | def test_django_single_report(self):
    method test_django_single_report_create_folder (line 108) | def test_django_single_report_create_folder(self):
    method test_django_multiple_reports (line 125) | def test_django_multiple_reports(self):
    method test_django_runner_extension (line 140) | def test_django_runner_extension(self):

FILE: tests/doctest_example.py
  function twice (line 2) | def twice(n):
  class Multiplicator (line 10) | class Multiplicator(object):
    method threetimes (line 11) | def threetimes(self, n):

FILE: tests/testsuite.py
  function _load_schema (line 31) | def _load_schema(version):
  function validate_junit_report (line 42) | def validate_junit_report(version, text):
  class DoctestTest (line 48) | class DoctestTest(unittest.TestCase):
    method test_doctest_example (line 50) | def test_doctest_example(self):
  function capture_stdout_stderr (line 66) | def capture_stdout_stderr():
  function _strip_xml (line 81) | def _strip_xml(xml, changes):
  function some_decorator (line 91) | def some_decorator(f):
  class XMLTestRunnerTestCase (line 102) | class XMLTestRunnerTestCase(unittest.TestCase):
    class DummyTest (line 106) | class DummyTest(unittest.TestCase):
      method test_skip (line 109) | def test_skip(self):
      method test_non_ascii_skip (line 113) | def test_non_ascii_skip(self):
      method test_pass (line 116) | def test_pass(self):
      method test_fail (line 119) | def test_fail(self):
      method test_expected_failure (line 123) | def test_expected_failure(self):
      method test_unexpected_success (line 127) | def test_unexpected_success(self):
      method test_error (line 130) | def test_error(self):
      method test_cdata_section (line 133) | def test_cdata_section(self):
      method test_invalid_xml_chars_in_doc (line 136) | def test_invalid_xml_chars_in_doc(self):
      method test_non_ascii_error (line 142) | def test_non_ascii_error(self):
      method test_unsafe_unicode (line 145) | def test_unsafe_unicode(self):
      method test_output_stdout_and_stderr (line 148) | def test_output_stdout_and_stderr(self):
      method test_runner_buffer_output_pass (line 152) | def test_runner_buffer_output_pass(self):
      method test_runner_buffer_output_fail (line 155) | def test_runner_buffer_output_fail(self):
      method test_output (line 159) | def test_output(self):
      method test_non_ascii_runner_buffer_output_fail (line 162) | def test_non_ascii_runner_buffer_output_fail(self):
    class DummySubTest (line 166) | class DummySubTest(unittest.TestCase):
      method test_subTest_pass (line 168) | def test_subTest_pass(self):
      method test_subTest_fail (line 173) | def test_subTest_fail(self):
      method test_subTest_error (line 178) | def test_subTest_error(self):
      method test_subTest_mixed (line 183) | def test_subTest_mixed(self):
      method test_subTest_with_dots (line 188) | def test_subTest_with_dots(self):
    class DecoratedUnitTest (line 193) | class DecoratedUnitTest(unittest.TestCase):
      method test_pass (line 196) | def test_pass(self):
    class DummyErrorInCallTest (line 199) | class DummyErrorInCallTest(unittest.TestCase):
      method __call__ (line 201) | def __call__(self, result):
      method test_pass (line 208) | def test_pass(self):
    class DummyRefCountTest (line 212) | class DummyRefCountTest(unittest.TestCase):
      class dummy (line 213) | class dummy(object):
      method test_fail (line 215) | def test_fail(self):
    method setUp (line 219) | def setUp(self):
    method _test_xmlrunner (line 226) | def _test_xmlrunner(self, suite, runner=None, outdir=None):
    method test_basic_unittest_constructs (line 247) | def test_basic_unittest_constructs(self):
    method test_classnames (line 257) | def test_classnames(self):
    method test_expected_failure (line 284) | def test_expected_failure(self):
    method test_unexpected_success (line 295) | def test_unexpected_success(self):
    method test_xmlrunner_non_ascii (line 306) | def test_xmlrunner_non_ascii(self):
    method test_xmlrunner_safe_xml_encoding_name (line 321) | def test_xmlrunner_safe_xml_encoding_name(self):
    method test_xmlrunner_check_for_valid_xml_streamout (line 335) | def test_xmlrunner_check_for_valid_xml_streamout(self):
    method test_xmlrunner_unsafe_unicode (line 367) | def test_xmlrunner_unsafe_unicode(self):
    method test_xmlrunner_non_ascii_failures (line 380) | def test_xmlrunner_non_ascii_failures(self):
    method test_xmlrunner_non_ascii_failures_buffered_output (line 383) | def test_xmlrunner_non_ascii_failures_buffered_output(self):
    method _xmlrunner_non_ascii_failures (line 386) | def _xmlrunner_non_ascii_failures(self, buffer=False):
    method test_xmlrunner_buffer_output_pass (line 415) | def test_xmlrunner_buffer_output_pass(self):
    method test_xmlrunner_buffer_output_fail (line 427) | def test_xmlrunner_buffer_output_fail(self):
    method test_xmlrunner_output_without_buffer (line 436) | def test_xmlrunner_output_without_buffer(self):
    method test_xmlrunner_output_with_buffer (line 444) | def test_xmlrunner_output_with_buffer(self):
    method test_xmlrunner_stdout_stderr_recovered_without_buffer (line 454) | def test_xmlrunner_stdout_stderr_recovered_without_buffer(self):
    method test_xmlrunner_stdout_stderr_recovered_with_buffer (line 463) | def test_xmlrunner_stdout_stderr_recovered_with_buffer(self):
    method test_unittest_subTest_fail (line 478) | def test_unittest_subTest_fail(self):
    method test_unittest_subTest_error (line 507) | def test_unittest_subTest_error(self):
    method test_unittest_subTest_mixed (line 537) | def test_unittest_subTest_mixed(self):
    method test_unittest_subTest_pass (line 562) | def test_unittest_subTest_pass(self):
    method test_unittest_subTest_with_dots (line 570) | def test_unittest_subTest_with_dots(self):
    method test_xmlrunner_pass (line 589) | def test_xmlrunner_pass(self):
    method test_xmlrunner_failfast (line 594) | def test_xmlrunner_failfast(self):
    method test_xmlrunner_verbose (line 609) | def test_xmlrunner_verbose(self):
    method test_xmlrunner_showall (line 615) | def test_xmlrunner_showall(self):
    method test_xmlrunner_cdata_section (line 621) | def test_xmlrunner_cdata_section(self):
    method test_xmlrunner_invalid_xml_chars_in_doc (line 626) | def test_xmlrunner_invalid_xml_chars_in_doc(self):
    method test_xmlrunner_outsuffix (line 643) | def test_xmlrunner_outsuffix(self):
    method test_xmlrunner_nosuffix (line 651) | def test_xmlrunner_nosuffix(self):
    method test_junitxml_properties (line 660) | def test_junitxml_properties(self):
    method test_junitxml_xsd_validation_order (line 666) | def test_junitxml_xsd_validation_order(self):
    method test_junitxml_xsd_validation_empty_properties (line 689) | def test_junitxml_xsd_validation_empty_properties(self):
    method test_xunit_plugin_transform (line 706) | def test_xunit_plugin_transform(self):
    method test_xmlrunner_elapsed_times (line 730) | def test_xmlrunner_elapsed_times(self):
    method test_xmlrunner_resultclass (line 736) | def test_xmlrunner_resultclass(self):
    method test_xmlrunner_stream (line 745) | def test_xmlrunner_stream(self):
    method test_xmlrunner_stream_empty_testsuite (line 755) | def test_xmlrunner_stream_empty_testsuite(self):
    method test_xmlrunner_output_subdir (line 764) | def test_xmlrunner_output_subdir(self):
    method test_xmlrunner_patched_stdout (line 774) | def test_xmlrunner_patched_stdout(self):
    method test_opaque_decorator (line 785) | def test_opaque_decorator(self):
    method test_xmlrunner_error_in_call (line 792) | def test_xmlrunner_error_in_call(self):
    method test_xmlrunner_hold_traceback (line 806) | def test_xmlrunner_hold_traceback(self):
    class StderrXMLTestRunner (line 818) | class StderrXMLTestRunner(xmlrunner.XMLTestRunner):
      method __init__ (line 827) | def __init__(self, **kwargs):
    method test_test_program_succeed_with_buffer (line 833) | def test_test_program_succeed_with_buffer(self):
    method test_test_program_succeed_wo_buffer (line 848) | def test_test_program_succeed_wo_buffer(self):
    method test_test_program_fail_with_buffer (line 862) | def test_test_program_fail_with_buffer(self):
    method test_test_program_fail_wo_buffer (line 877) | def test_test_program_fail_wo_buffer(self):
    method test_partialmethod (line 891) | def test_partialmethod(self):
  class DuplicateWriterTestCase (line 908) | class DuplicateWriterTestCase(unittest.TestCase):
    method setUp (line 909) | def setUp(self):
    method tearDown (line 915) | def tearDown(self):
    method getFirstContent (line 920) | def getFirstContent(self):
    method getSecondContent (line 924) | def getSecondContent(self):
    method test_flush (line 927) | def test_flush(self):
    method test_writable (line 932) | def test_writable(self):
    method test_writelines (line 935) | def test_writelines(self):
    method test_write (line 944) | def test_write(self):
  class XMLProgramTestCase (line 953) | class XMLProgramTestCase(unittest.TestCase):
    method test_xmlrunner_output (line 957) | def test_xmlrunner_output(self, exiter, testrunner):
    method test_xmlrunner_output_file (line 978) | def test_xmlrunner_output_file(self, exiter, testrunner, opener):
    method test_xmlrunner_outsuffix (line 1002) | def test_xmlrunner_outsuffix(self, exiter, testrunner, opener):
  class ResolveFilenameTestCase (line 1020) | class ResolveFilenameTestCase(unittest.TestCase):
    method test_resolve_filename_relative (line 1022) | def test_resolve_filename_relative(self, relpath):
    method test_resolve_filename_outside (line 1028) | def test_resolve_filename_outside(self, relpath):
    method test_resolve_filename_error (line 1034) | def test_resolve_filename_error(self, relpath):

FILE: xmlrunner/builder.py
  function replace_nontext (line 33) | def replace_nontext(text, replacement=u'\uFFFD'):
  class TestXMLContext (line 37) | class TestXMLContext(object):
    method __init__ (line 55) | def __init__(self, xml_doc, parent_context=None):
    method element_tag (line 66) | def element_tag(self):
    method begin (line 71) | def begin(self, tag, name):
    method end (line 79) | def end(self):
    method _set_result_counters (line 92) | def _set_result_counters(self):
    method increment_counter (line 116) | def increment_counter(self, counter_name):
    method elapsed_time (line 124) | def elapsed_time(self):
    method timestamp (line 130) | def timestamp(self):
  class TestXMLBuilder (line 136) | class TestXMLBuilder(object):
    method __init__ (line 141) | def __init__(self):
    method current_context (line 147) | def current_context(self):
    method begin_context (line 152) | def begin_context(self, tag, name):
    method context_tag (line 161) | def context_tag(self):
    method _create_cdata_section (line 166) | def _create_cdata_section(self, content):
    method append_cdata_section (line 173) | def append_cdata_section(self, tag, content):
    method append (line 191) | def append(self, tag, content, **kwargs):
    method _append_child (line 208) | def _append_child(self, element):
    method increment_counter (line 217) | def increment_counter(self, counter_name):
    method end_context (line 226) | def end_context(self):
    method finish (line 241) | def finish(self):

FILE: xmlrunner/extra/djangotestrunner.py
  class XMLTestRunner (line 19) | class XMLTestRunner(DiscoverRunner):
    method get_resultclass (line 22) | def get_resultclass(self):
    method get_test_runner_kwargs (line 27) | def get_test_runner_kwargs(self):
    method run_suite (line 55) | def run_suite(self, suite, **kwargs):

FILE: xmlrunner/extra/xunit_plugin.py
  function transform (line 24) | def transform(xml_data):

FILE: xmlrunner/result.py
  function safe_unicode (line 51) | def safe_unicode(data, encoding='utf8'):
  function testcase_name (line 61) | def testcase_name(test_method):
  function resolve_filename (line 72) | def resolve_filename(filename):
  class _DuplicateWriter (line 82) | class _DuplicateWriter(io.TextIOBase):
    method __init__ (line 89) | def __init__(self, first, second):
    method flush (line 94) | def flush(self):
    method writable (line 104) | def writable(self):
    method getvalue (line 107) | def getvalue(self):
    method writelines (line 110) | def writelines(self, lines):
    method write (line 114) | def write(self, b):
  class _TestInfo (line 130) | class _TestInfo(object):
    method __init__ (line 146) | def __init__(self, test_result, test_method, outcome=SUCCESS, err=None...
    method id (line 179) | def id(self):
    method test_finished (line 182) | def test_finished(self):
    method get_error_info (line 190) | def get_error_info(self):
    method shortDescription (line 197) | def shortDescription(self):
  class _XMLTestResult (line 201) | class _XMLTestResult(TextTestResult):
    method __init__ (line 207) | def __init__(self, stream=sys.stderr, descriptions=1, verbosity=1,
    method _prepare_callback (line 228) | def _prepare_callback(self, test_info, target_list, verbose_str,
    method startTest (line 261) | def startTest(self, test):
    method _setupStdout (line 294) | def _setupStdout(self):
    method _restoreStdout (line 304) | def _restoreStdout(self):
    method _save_output_data (line 320) | def _save_output_data(self):
    method stopTest (line 324) | def stopTest(self, test):
    method addSuccess (line 339) | def addSuccess(self, test):
    method addFailure (line 349) | def addFailure(self, test, err):
    method addError (line 363) | def addError(self, test, err):
    method addSubTest (line 376) | def addSubTest(self, testcase, test, err):
    method addSkip (line 405) | def addSkip(self, test, reason):
    method addExpectedFailure (line 417) | def addExpectedFailure(self, test, err):
    method addUnexpectedSuccess (line 431) | def addUnexpectedSuccess(self, test):
    method printErrorList (line 447) | def printErrorList(self, flavour, errors):
    method _get_info_by_testcase (line 461) | def _get_info_by_testcase(self):
    method _report_testsuite_properties (line 482) | def _report_testsuite_properties(xml_testsuite, xml_document, properti...
    method _report_testsuite (line 494) | def _report_testsuite(suite_name, tests, xml_document, parentElement,
    method _test_method_name (line 534) | def _test_method_name(test_id):
    method _createCDATAsections (line 546) | def _createCDATAsections(xmldoc, node, text):
    method _report_testcase (line 560) | def _report_testcase(test_result, xml_testsuite, xml_document):
    method generate_reports (line 626) | def generate_reports(self, test_runner):
    method _exc_info_to_string (line 682) | def _exc_info_to_string(self, err, test):
    method getDescription (line 686) | def getDescription(self, test):

FILE: xmlrunner/runner.py
  class XMLTestRunner (line 14) | class XMLTestRunner(TextTestRunner):
    method __init__ (line 19) | def __init__(self, output='.', outsuffix=None,
    method _make_result (line 37) | def _make_result(self):
    method run (line 47) | def run(self, test):
  class XMLTestProgram (line 120) | class XMLTestProgram(TestProgram):
    method __init__ (line 122) | def __init__(self, *args, **kwargs):
    method _parseKnownArgs (line 128) | def _parseKnownArgs(self, kwargs):
    method _initArgParsers (line 151) | def _initArgParsers(self):
    method runTests (line 167) | def runTests(self):
Condensed preview — 50 files, each showing path, character count, and a content snippet. Download the .json file or copy for the full structured content (157K chars).
[
  {
    "path": ".coveragerc",
    "chars": 43,
    "preview": "[report]\ninclude =\n  tests/*\n  xmlrunner/*\n"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/bug_report.md",
    "chars": 864,
    "preview": "<!--\nThank you for your contribution and taking the time to report the issue!\nNote: Please search to see if an issue alr"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/feature_request.md",
    "chars": 1164,
    "preview": "<!--\nThank you for your contribution and taking the time to report the issue!\nNote: Please search to see if an issue alr"
  },
  {
    "path": ".github/pull_request_template.md",
    "chars": 485,
    "preview": "<!--\nThank you for submitting a PR and your contribution!\n\nQuick checklist\n\n- [ ] Include unit tests when applicable\n- ["
  },
  {
    "path": ".github/workflows/tests.yml",
    "chars": 1588,
    "preview": "name: Tests\n\non:\n  push:\n    branches:\n      - master\n  pull_request:\n\njobs:\n  build:\n    runs-on: ubuntu-latest\n    str"
  },
  {
    "path": ".gitignore",
    "chars": 171,
    "preview": "# Python bytecode\n*.pyc\n\n# Build directory\nbuild/*\ndist/*\n\n# Egg info directory\n*.egg-info\n\n# tox + coverage\n.tox\n.cover"
  },
  {
    "path": ".landscape.yml",
    "chars": 126,
    "preview": "doc-warnings: true\ntest-warnings: false\nstrictness: veryhigh\nmax-line-length: 80\nautodetect: true\npython-targets:\n  - 2\n"
  },
  {
    "path": "LICENSE",
    "chars": 1532,
    "preview": "Copyright (c) 2008-2013, Daniel Fernandes Martins\nAll rights reserved.\n\nRedistribution and use in source and binary form"
  },
  {
    "path": "MANIFEST.in",
    "chars": 33,
    "preview": "include README.md\ninclude LICENSE"
  },
  {
    "path": "Makefile",
    "chars": 639,
    "preview": "\nbuild/tox/bin:\n\tpython3 -m venv build/tox\n\tbuild/tox/bin/pip install tox\n\nbuild/publish/bin:\n\tpython3 -m venv build/pub"
  },
  {
    "path": "README.md",
    "chars": 9982,
    "preview": "[![License](https://img.shields.io/pypi/l/unittest-xml-reporting.svg)](https://pypi.python.org/pypi/unittest-xml-reporti"
  },
  {
    "path": "docs/Makefile",
    "chars": 7670,
    "preview": "# Makefile for Sphinx documentation\n#\n\n# You can set these variables from the command line.\nSPHINXOPTS    =\nSPHINXBUILD "
  },
  {
    "path": "docs/conf.py",
    "chars": 9870,
    "preview": "# -*- coding: utf-8 -*-\n#\n# unittest-xml-reporting documentation build configuration file, created by\n# sphinx-quickstar"
  },
  {
    "path": "docs/index.rst",
    "chars": 1114,
    "preview": "unittest-xml-reporting\n======================\n\n``unittest-xml-reporting`` is a ``unittest`` test runner that can save\nte"
  },
  {
    "path": "pyproject.toml",
    "chars": 1547,
    "preview": "[build-system]\nbuild-backend = \"setuptools.build_meta\"\nrequires = [\n    \"setuptools>=77\",\n    \"setuptools-scm[toml]>=6.2"
  },
  {
    "path": "tests/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "tests/builder_test.py",
    "chars": 8645,
    "preview": "# -*- coding: utf-8\n\nfrom xmlrunner.unittest import unittest\n\nimport xml.etree.ElementTree as ET\nfrom xml.dom.minidom im"
  },
  {
    "path": "tests/discovery_test.py",
    "chars": 106,
    "preview": "import unittest\n\n\nclass DiscoveryTest(unittest.TestCase):\n    def test_discovery_pass(self):\n        pass\n"
  },
  {
    "path": "tests/django_example/app/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "tests/django_example/app/admin.py",
    "chars": 71,
    "preview": "from django.contrib import admin  # NOQA\n\n# Register your models here.\n"
  },
  {
    "path": "tests/django_example/app/migrations/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "tests/django_example/app/models.py",
    "chars": 65,
    "preview": "from django.db import models  # NOQA\n\n# Create your models here.\n"
  },
  {
    "path": "tests/django_example/app/tests.py",
    "chars": 346,
    "preview": "from django.test import TestCase\n\n\n# Create your tests here.\nclass DummyTestCase(TestCase):\n    def test_pass(self):\n   "
  },
  {
    "path": "tests/django_example/app/views.py",
    "chars": 71,
    "preview": "from django.shortcuts import render  # NOQA\n\n# Create your views here.\n"
  },
  {
    "path": "tests/django_example/app2/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "tests/django_example/app2/admin.py",
    "chars": 71,
    "preview": "from django.contrib import admin  # NOQA\n\n# Register your models here.\n"
  },
  {
    "path": "tests/django_example/app2/migrations/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "tests/django_example/app2/models.py",
    "chars": 65,
    "preview": "from django.db import models  # NOQA\n\n# Create your models here.\n"
  },
  {
    "path": "tests/django_example/app2/tests.py",
    "chars": 130,
    "preview": "from django.test import TestCase\n\n\n# Create your tests here.\nclass DummyTestCase(TestCase):\n    def test_pass(self):\n   "
  },
  {
    "path": "tests/django_example/app2/views.py",
    "chars": 71,
    "preview": "from django.shortcuts import render  # NOQA\n\n# Create your views here.\n"
  },
  {
    "path": "tests/django_example/example/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "tests/django_example/example/settings.py",
    "chars": 752,
    "preview": "\nimport os\n\n\nBASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\nSECRET_KEY = 'not-a-secret'\nDEBUG = "
  },
  {
    "path": "tests/django_example/example/urls.py",
    "chars": 125,
    "preview": "\nfrom django.conf.urls import url\nfrom django.contrib import admin\n\n\nurlpatterns = [\n    url(r'^admin/', admin.site.urls"
  },
  {
    "path": "tests/django_example/example/wsgi.py",
    "chars": 168,
    "preview": "\nimport os\nfrom django.core.wsgi import get_wsgi_application\n\n\nos.environ.setdefault(\"DJANGO_SETTINGS_MODULE\", \"example."
  },
  {
    "path": "tests/django_example/manage.py",
    "chars": 249,
    "preview": "#!/usr/bin/env python\nimport os\nimport sys\n\n\nif __name__ == \"__main__\":\n    os.environ.setdefault(\"DJANGO_SETTINGS_MODUL"
  },
  {
    "path": "tests/django_test.py",
    "chars": 5747,
    "preview": "from xmlrunner.unittest import unittest\n\nimport sys\nimport os\nfrom os import path\nimport glob\nfrom unittest import mock\n"
  },
  {
    "path": "tests/doctest_example.py",
    "chars": 230,
    "preview": "\ndef twice(n):\n    \"\"\"\n    >>> twice(5)\n    10\n    \"\"\"\n    return 2 * n\n\n\nclass Multiplicator(object):\n    def threetime"
  },
  {
    "path": "tests/testsuite.py",
    "chars": 37362,
    "preview": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\n\"\"\"Executable module to test unittest-xml-reporting.\n\"\"\"\nfrom __future__ "
  },
  {
    "path": "tests/vendor/jenkins/xunit-plugin/14c6e39c38408b9ed6280361484a13c6f5becca7/junit-10.xsd",
    "chars": 6314,
    "preview": "<?xml version=\"1.0\" encoding=\"UTF-8\" ?>\n<!--\nThe MIT License (MIT)\n\nCopyright (c) 2014, Gregory Boissinot\n\nPermission is"
  },
  {
    "path": "tests/vendor/jenkins/xunit-plugin/ae25da5089d4f94ac6c4669bf736e4d416cc4665/junit-10.xsd",
    "chars": 6555,
    "preview": "<?xml version=\"1.0\" encoding=\"UTF-8\" ?>\n<!--\nThe MIT License (MIT)\n\nCopyright (c) 2014, Gregory Boissinot\n\nPermission is"
  },
  {
    "path": "tox.ini",
    "chars": 1415,
    "preview": "[pytest]\npython_files = *_test.py test*.py\ntestpaths = tests\nnorecursedirs = tests/django_example\n\n[tox]\nenvlist = begin"
  },
  {
    "path": "xmlrunner/__init__.py",
    "chars": 346,
    "preview": "# -*- coding: utf-8 -*-\n\n\"\"\"\nThis module provides the XMLTestRunner class, which is heavily based on the\ndefault TextTes"
  },
  {
    "path": "xmlrunner/__main__.py",
    "chars": 496,
    "preview": "\"\"\"Main entry point\"\"\"\n\nimport sys\nfrom .runner import XMLTestProgram\n\nif sys.argv[0].endswith(\"__main__.py\"):\n    impor"
  },
  {
    "path": "xmlrunner/builder.py",
    "chars": 8271,
    "preview": "import re\nimport sys\nimport datetime\nimport time\n\nfrom xml.dom.minidom import Document\n\n\n__all__ = ('TestXMLBuilder', 'T"
  },
  {
    "path": "xmlrunner/extra/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "xmlrunner/extra/djangotestrunner.py",
    "chars": 2168,
    "preview": "# -*- coding: utf-8 -*-\n\n\"\"\"\nCustom Django test runner that runs the tests using the\nXMLTestRunner class.\n\nThis script s"
  },
  {
    "path": "xmlrunner/extra/xunit_plugin.py",
    "chars": 796,
    "preview": "import io\nimport lxml.etree as etree\n\n\nTRANSFORM = etree.XSLT(etree.XML(b'''\\\n<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<xs"
  },
  {
    "path": "xmlrunner/result.py",
    "chars": 23732,
    "preview": "\nimport inspect\nimport io\nimport os\nimport sys\nimport datetime\nimport traceback\nimport re\nfrom os import path\nfrom io im"
  },
  {
    "path": "xmlrunner/runner.py",
    "chars": 6644,
    "preview": "\nimport argparse\nimport sys\nimport time\n\nfrom .unittest import TextTestRunner, TestProgram\nfrom .result import _XMLTestR"
  },
  {
    "path": "xmlrunner/unittest.py",
    "chars": 366,
    "preview": "\nfrom __future__ import absolute_import\n\nimport sys\n# pylint: disable-msg=W0611\nimport unittest\nfrom unittest import Tex"
  }
]

About this extraction

This page contains the full source code of the xmlrunner/unittest-xml-reporting GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 50 files (144.7 KB), approximately 36.5k tokens, and a symbol index with 236 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.

Copied to clipboard!