Full Code of BoldBrowser/bold-browser for AI

Repository: BoldBrowser/bold-browser
Branch: master
Commit: 339a5de2b960
Files: 162
Total size: 3.5 MB

Directory structure:
gitextract_xa9r_bzi/

├── .cirrus.yml
├── .cirrus_Dockerfile
├── .cirrus_requirements.txt
├── .github/
│   ├── ISSUE_TEMPLATE/
│   │   ├── bugreport.md
│   │   ├── create-an--updating-to-chromium-x-x-x-x-.md
│   │   ├── feature_request.md
│   │   └── other.md
│   └── PULL_REQUEST_TEMPLATE.md
├── .gitignore
├── .style.yapf
├── LICENSE
├── README.md
├── SUPPORT.md
├── chromium_version.txt
├── devutils/
│   ├── .coveragerc
│   ├── README.md
│   ├── __init__.py
│   ├── check_all_code.sh
│   ├── check_downloads_ini.py
│   ├── check_files_exist.py
│   ├── check_gn_flags.py
│   ├── check_patch_files.py
│   ├── print_tag_version.sh
│   ├── pytest.ini
│   ├── run_devutils_pylint.py
│   ├── run_devutils_tests.sh
│   ├── run_devutils_yapf.sh
│   ├── run_other_pylint.py
│   ├── run_other_yapf.sh
│   ├── run_utils_pylint.py
│   ├── run_utils_tests.sh
│   ├── run_utils_yapf.sh
│   ├── set_quilt_vars.sh
│   ├── tests/
│   │   ├── __init__.py
│   │   ├── test_check_patch_files.py
│   │   └── test_validate_patches.py
│   ├── third_party/
│   │   ├── README.md
│   │   ├── __init__.py
│   │   └── unidiff/
│   │       ├── __init__.py
│   │       ├── __version__.py
│   │       ├── constants.py
│   │       ├── errors.py
│   │       └── patch.py
│   ├── update_lists.py
│   ├── update_platform_patches.py
│   ├── validate_config.py
│   └── validate_patches.py
├── docs/
│   ├── building.md
│   ├── contributing.md
│   ├── design.md
│   ├── developing.md
│   ├── flags.md
│   ├── platforms.md
│   └── repo_management.md
├── domain_regex.list
├── domain_substitution.list
├── downloads.ini
├── flags.gn
├── patches/
│   ├── core/
│   │   ├── bromite/
│   │   │   └── disable-fetching-field-trials.patch
│   │   ├── debian/
│   │   │   └── disable/
│   │   │       └── unrar.patch
│   │   ├── inox-patchset/
│   │   │   ├── 0001-fix-building-without-safebrowsing.patch
│   │   │   ├── 0003-disable-autofill-download-manager.patch
│   │   │   ├── 0005-disable-default-extensions.patch
│   │   │   ├── 0007-disable-web-resource-service.patch
│   │   │   ├── 0009-disable-google-ipv6-probes.patch
│   │   │   ├── 0014-disable-translation-lang-fetch.patch
│   │   │   ├── 0015-disable-update-pings.patch
│   │   │   ├── 0017-disable-new-avatar-menu.patch
│   │   │   └── 0021-disable-rlz.patch
│   │   ├── iridium-browser/
│   │   │   ├── all-add-trk-prefixes-to-possibly-evil-connections.patch
│   │   │   ├── safe_browsing-disable-incident-reporting.patch
│   │   │   └── safe_browsing-disable-reporting-of-safebrowsing-over.patch
│   │   └── ungoogled-chromium/
│   │       ├── block-trk-and-subdomains.patch
│   │       ├── disable-crash-reporter.patch
│   │       ├── disable-domain-reliability.patch
│   │       ├── disable-fonts-googleapis-references.patch
│   │       ├── disable-gaia.patch
│   │       ├── disable-gcm.patch
│   │       ├── disable-google-host-detection.patch
│   │       ├── disable-mei-preload.patch
│   │       ├── disable-network-time-tracker.patch
│   │       ├── disable-profile-avatar-downloading.patch
│   │       ├── disable-signin.patch
│   │       ├── disable-translate.patch
│   │       ├── disable-untraceable-urls.patch
│   │       ├── disable-webrtc-log-uploader.patch
│   │       ├── disable-webstore-urls.patch
│   │       ├── fix-building-without-enabling-reporting.patch
│   │       ├── fix-building-without-one-click-signin.patch
│   │       ├── fix-building-without-safebrowsing.patch
│   │       ├── fix-learn-doubleclick-hsts.patch
│   │       ├── remove-unused-preferences-fields.patch
│   │       ├── replace-google-search-engine-with-nosearch.patch
│   │       └── use-local-devtools-files.patch
│   ├── extra/
│   │   ├── bromite/
│   │   │   ├── fingerprinting-flags-client-rects-and-measuretext.patch
│   │   │   ├── flag-fingerprinting-canvas-image-data-noise.patch
│   │   │   └── flag-max-connections-per-host.patch
│   │   ├── debian/
│   │   │   ├── disable/
│   │   │   │   ├── android.patch
│   │   │   │   ├── device-notifications.patch
│   │   │   │   ├── fuzzers.patch
│   │   │   │   ├── google-api-warning.patch
│   │   │   │   └── welcome-page.patch
│   │   │   ├── fixes/
│   │   │   │   └── connection-message.patch
│   │   │   ├── gn/
│   │   │   │   └── parallel.patch
│   │   │   └── warnings/
│   │   │       └── initialization.patch
│   │   ├── inox-patchset/
│   │   │   ├── 0006-modify-default-prefs.patch
│   │   │   ├── 0008-restore-classic-ntp.patch
│   │   │   ├── 0011-add-duckduckgo-search-engine.patch
│   │   │   ├── 0013-disable-missing-key-warning.patch
│   │   │   ├── 0016-chromium-sandbox-pie.patch
│   │   │   ├── 0018-disable-first-run-behaviour.patch
│   │   │   └── 0019-disable-battery-status-service.patch
│   │   ├── iridium-browser/
│   │   │   ├── Remove-EV-certificates.patch
│   │   │   ├── browser-disable-profile-auto-import-on-first-run.patch
│   │   │   ├── mime_util-force-text-x-suse-ymp-to-be-downloaded.patch
│   │   │   ├── net-cert-increase-default-key-length-for-newly-gener.patch
│   │   │   ├── prefs-always-prompt-for-download-directory-by-defaul.patch
│   │   │   ├── prefs-only-keep-cookies-until-exit.patch
│   │   │   └── updater-disable-auto-update.patch
│   │   └── ungoogled-chromium/
│   │       ├── add-components-ungoogled.patch
│   │       ├── add-flag-for-pdf-plugin-name.patch
│   │       ├── add-flag-for-search-engine-collection.patch
│   │       ├── add-flag-to-configure-extension-downloading.patch
│   │       ├── add-flag-to-disable-beforeunload.patch
│   │       ├── add-flag-to-force-punycode-hostnames.patch
│   │       ├── add-flag-to-hide-crashed-bubble.patch
│   │       ├── add-flag-to-scroll-tabs.patch
│   │       ├── add-flag-to-show-avatar-button.patch
│   │       ├── add-flag-to-stack-tabs.patch
│   │       ├── add-ipv6-probing-option.patch
│   │       ├── add-suggestions-url-field.patch
│   │       ├── default-to-https-scheme.patch
│   │       ├── disable-download-quarantine.patch
│   │       ├── disable-formatting-in-omnibox.patch
│   │       ├── disable-intranet-redirect-detector.patch
│   │       ├── disable-webgl-renderer-info.patch
│   │       ├── enable-checkbox-external-protocol.patch
│   │       ├── enable-page-saving-on-more-pages.patch
│   │       ├── enable-paste-and-go-new-tab-button.patch
│   │       ├── fix-building-without-mdns-and-service-discovery.patch
│   │       ├── popups-to-tabs.patch
│   │       ├── remove-disable-setuid-sandbox-as-bad-flag.patch
│   │       └── searx.patch
│   └── series
├── pruning.list
├── revision.txt
└── utils/
    ├── .coveragerc
    ├── __init__.py
    ├── _common.py
    ├── _extraction.py
    ├── domain_substitution.py
    ├── downloads.py
    ├── filescfg.py
    ├── patches.py
    ├── prune_binaries.py
    ├── pytest.ini
    ├── tests/
    │   ├── __init__.py
    │   ├── test_domain_substitution.py
    │   └── test_patches.py
    └── third_party/
        ├── README.md
        ├── __init__.py
        └── schema.py

================================================
FILE CONTENTS
================================================

================================================
FILE: .cirrus.yml
================================================
container:
    dockerfile: .cirrus_Dockerfile

code_check_task:
    pip_cache:
        folder: ~/.cache/pip
        fingerprint_script: cat .cirrus_requirements.txt
        populate_script: pip install -r .cirrus_requirements.txt
    pip_install_script:
        # Needed in order for yapf to be fully installed
        - pip install -r .cirrus_requirements.txt
    utils_script:
        - python3 -m yapf --style '.style.yapf' -e '*/third_party/*' -rpd utils
        - ./devutils/run_utils_pylint.py --hide-fixme
        - ./devutils/run_utils_tests.sh
    devutils_script:
        - python3 -m yapf --style '.style.yapf' -e '*/third_party/*' -rpd devutils
        - ./devutils/run_devutils_pylint.py --hide-fixme
        - ./devutils/run_devutils_tests.sh

validate_config_task:
    validate_config_script: ./devutils/validate_config.py

validate_with_source_task:
    chromium_download_cache:
        folder: chromium_download_cache
        fingerprint_script: cat chromium_version.txt
        populate_script:
            # This directory will not exist when this is called, unless cache retrieval
            # fails and leaves partially-complete files around.
            - rm -rf chromium_download_cache || true
            - mkdir chromium_download_cache
            - ./utils/downloads.py retrieve -i downloads.ini -c chromium_download_cache
    unpack_source_script:
        - ./utils/downloads.py unpack -i downloads.ini -c chromium_download_cache chromium_src
    validate_patches_script:
        - ./devutils/validate_patches.py -l chromium_src
    validate_lists_script:
        # NOTE: This check is prone to false positives, but not false negatives.
        - ./devutils/check_files_exist.py chromium_src pruning.list domain_substitution.list

# vim: set expandtab shiftwidth=4 softtabstop=4:


================================================
FILE: .cirrus_Dockerfile
================================================
# Dockerfile for Python 3 with xz-utils (for tar.xz unpacking)

FROM python:3.6-slim

RUN apt update && apt install -y xz-utils patch axel


================================================
FILE: .cirrus_requirements.txt
================================================
# Based on Python package versions in Debian buster
astroid==2.1.0 # via pylint
pylint==2.2.2
pytest-cov==2.6.0
pytest==3.10.1
requests==2.21.0
yapf==0.25.0


================================================
FILE: .github/ISSUE_TEMPLATE/bugreport.md
================================================
---
name: Bug report
about: Report a bug building or running ungoogled-chromium
title: ''
labels: ''
assignees: ''

---

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

**Expected behavior**
A clear and concise description of what you expected to happen.

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Environment (please complete the following information):**
 - OS/Platform and version:
 - ungoogled-chromium version:

**Additional context**
Add any other context about the problem here.


================================================
FILE: .github/ISSUE_TEMPLATE/create-an--updating-to-chromium-x-x-x-x-.md
================================================
---
name: Create an "Updating to Chromium x.x.x.x"
about: For letting the community track progress to a new stable Chromium
title: Updating to Chromium [VERSION_HERE]
labels: enhancement, help wanted
assignees: ''

---

### Please respond if you would like to update ungoogled-chromium to the new stable Chromium version.

## Notes for the developer

Once you claim it, it is advisable that you create a [Draft Pull Request](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-requests#draft-pull-requests) for increased visibility:

![GitHub Interface for creating Draft Pull Requests](https://help.github.com/assets/images/help/pull_requests/pullrequest-send.png)

Finally, make sure to reference this issue in your PR.

Feel free to ask questions and discuss issues here along the way. Thanks!

## Notes for others

Feel free to raise issues or questions throughout the process. However, please refrain from asking for ETAs unless no visible progress has been made here or on the developer's PR for a while (e.g. 2 weeks).


================================================
FILE: .github/ISSUE_TEMPLATE/feature_request.md
================================================
---
name: Feature request
about: Suggest an idea
title: ''
labels: ''
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.


================================================
FILE: .github/ISSUE_TEMPLATE/other.md
================================================
---
name: Other
about: Anything else not listed
title: ''
labels: ''
assignees: ''

---

*IMPORTANT: Please ensure you have read SUPPORT.md before continuing*


================================================
FILE: .github/PULL_REQUEST_TEMPLATE.md
================================================
*(Please ensure you have read SUPPORT.md and docs/contributing.md before submitting the Pull Request)*


================================================
FILE: .gitignore
================================================
# Python files
__pycache__/
*.py[cod]

# Python testing files
.coverage

# Ignore macOS Finder meta
.DS_Store
.tm_properties

# Ignore optional build / cache directory
/build


================================================
FILE: .style.yapf
================================================
[style]
based_on_style = pep8
allow_split_before_dict_value = false
coalesce_brackets = true
column_limit = 100
indent_width = 4
join_multiple_lines = true
spaces_before_comment = 1


================================================
FILE: LICENSE
================================================
BSD 3-Clause License

Copyright (c) 2015-2020, The ungoogled-chromium Authors
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this
   list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice,
   this list of conditions and the following disclaimer in the documentation
   and/or other materials provided with the distribution.

3. Neither the name of the copyright holder nor the names of its
   contributors may be used to endorse or promote products derived from
   this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


================================================
FILE: README.md
================================================
# ungoogled-chromium

*A lightweight approach to removing Google web service dependency*

**Help is welcome!** See the [docs/contributing.md](docs/contributing.md) document for more information.

## Objectives

In descending order of significance (i.e. most important objective first):

1. **ungoogled-chromium is Google Chromium, sans dependency on Google web services**.
2. **ungoogled-chromium retains the default Chromium experience as closely as possible**. Unlike other Chromium forks that have their own visions of a web browser, ungoogled-chromium is essentially a drop-in replacement for Chromium.
3. **ungoogled-chromium features tweaks to enhance privacy, control, and transparency**. However, almost all of these features must be manually activated or enabled. For more details, see [Feature Overview](#feature-overview).

In scenarios where the objectives conflict, the objective of higher significance should take precedence.

## Content Overview

* [Objectives](#objectives)
* [Motivation and Philosophy](#motivation-and-philosophy)
* [Feature Overview](#feature-overview)
* [**Downloads**](#downloads)
* [Source Code](#source-code)
* [**FAQ**](#faq)
* [Building Instructions](#building-instructions)
* [Design Documentation](#design-documentation)
* [**Contributing, Reporting, Contacting**](#contributing-reporting-contacting)
* [Credits](#credits)
* [Related Projects](#related-projects)
* [License](#license)

## Motivation and Philosophy

Without signing in to a Google Account, Chromium does pretty well in terms of security and privacy. However, Chromium still has some dependency on Google web services and binaries. In addition, Google designed Chromium to be easy and intuitive for users, which means they compromise on transparency and control of internal operations.

ungoogled-chromium addresses these issues in the following ways:

1. Remove all remaining background requests to any web services while building and running the browser.
2. Remove all code specific to Google web services.
3. Remove all uses of pre-made binaries from the source code, and replace them with user-provided alternatives when possible.
4. Disable features that inhibit control and transparency, and add or modify features that promote them (these changes will almost always require manual activation or enabling).

These features are implemented as configuration flags, patches, and custom scripts. For more details, consult the [Design Documentation](docs/design.md).

## Feature Overview

*This section overviews the features of ungoogled-chromium. For more detailed information, it is best to consult the source code.*

Contents of this section:

* [Key Features](#key-features)
* [Enhancing Features](#enhancing-features)
* [Borrowed Features](#borrowed-features)
* [Supported Platforms and Distributions](#supported-platforms-and-distributions)

### Key Features

*These are the core features introduced by ungoogled-chromium.*

* Disable functionality specific to Google domains (e.g. Google Host Detector, Google URL Tracker, Google Cloud Messaging, Google Hotwording, etc.)
    * This includes disabling [Safe Browsing](//en.wikipedia.org/wiki/Google_Safe_Browsing). Consult [the FAQ for the rationale](//ungoogled-software.github.io/ungoogled-chromium-wiki/faq#why-is-safe-browsing-disabled).
* Block internal requests to Google at runtime. This feature is a fail-safe measure for the above, in case Google changes or introduces new components that our patches do not disable. This feature is implemented by replacing many Google web domains in the source code with non-existent alternatives ending in `qjz9zk` (known as domain substitution; [see docs/design.md](docs/design.md#source-file-processors) for details), then [modifying Chromium to block its own requests with such domains](patches/core/ungoogled-chromium/block-trk-and-subdomains.patch). In other words, no connections are attempted to the `qjz9zk` domain.
* Strip binaries from the source code (known as binary pruning; [see docs/design.md](docs/design.md#source-file-processors) for details)
* Add many new command-line switches and `chrome://flags` entries to configure disabled-by-default features. See [docs/flags.md](docs/flags.md) for the exhaustive list.
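The domain substitution described above can be sketched in a few lines. This is an illustrative example only: the mappings below are hypothetical stand-ins, while the real regex list lives in `domain_regex.list` and is applied by `utils/domain_substitution.py`.

```python
import re

# Illustrative sketch of domain substitution: Google web domains in the
# source tree are rewritten to inert equivalents under the non-existent
# `qjz9zk` TLD, so any leftover request logic resolves nowhere.
# These two mappings are examples, not the project's actual list.
_SUBSTITUTIONS = (
    (re.compile(r'google\.com'), '9oo91e.qjz9zk'),
    (re.compile(r'gstatic\.com'), '95tat1c.qjz9zk'),
)

def substitute_domains(text):
    """Apply each (pattern, replacement) pair to the given source text."""
    for pattern, replacement in _SUBSTITUTIONS:
        text = pattern.sub(replacement, text)
    return text
```

Combined with the `block-trk-and-subdomains.patch` mentioned above, any request to a substituted domain is stopped inside the browser itself.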

### Enhancing Features

*These are the non-essential features introduced by ungoogled-chromium.*

* Use HTTPS by default when a URL scheme is not provided (e.g. Omnibox, bookmarks, command-line)
* Add *Suggestions URL* text field in the search engine editor (`chrome://settings/searchEngines`) for customizing search engine suggestions.
* Add more URL schemes allowed for saving pages.
* Add Omnibox search provider "No Search" to allow disabling of searching
* Add a custom cross-platform build configuration and packaging wrapper for Chromium. It currently supports many Linux distributions, macOS, and Windows. (See [docs/design.md](docs/design.md) for details on the system.)
* Force all pop-ups into tabs
* Disable automatic formatting of URLs in Omnibox (e.g. stripping `http://`, hiding certain parameters)
* Disable intranet redirect detector (extraneous DNS requests)
    * This breaks captive portal detection, but captive portals still work.
* (Iridium Browser feature change) Prevent URLs with the `trk:` scheme from connecting to the Internet
    * Also prevents any URLs with the top-level domain `qjz9zk` (as used in domain substitution) from attempting a connection.
* (Iridium and Inox feature change) Prevent pinging of IPv6 address when detecting the availability of IPv6. See the `--set-ipv6-probe-false` flag above to adjust the behavior instead.
* (Windows-specific) Do not set the Zone Identifier on downloaded files
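The default-to-HTTPS behavior in the list above amounts to a small scheme check before navigation. A minimal sketch, using a hypothetical helper name (the real change is in `patches/extra/ungoogled-chromium/default-to-https-scheme.patch`):

```python
def normalize_url(typed):
    """Sketch of the default-to-HTTPS feature: when the typed text
    (Omnibox, bookmark, command line) carries no scheme, assume
    https:// instead of the usual http:// fallback."""
    if '://' in typed:
        return typed  # an explicit scheme is left untouched
    return 'https://' + typed
```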

### Borrowed Features

In addition to the features introduced by ungoogled-chromium, ungoogled-chromium selectively borrows many features from the following projects (in approximate order of significance):

* [Inox patchset](//github.com/gcarq/inox-patchset)
* [Bromite](//github.com/bromite/bromite)
* [Debian](//tracker.debian.org/pkg/chromium-browser)
* [Iridium Browser](//iridiumbrowser.de/)

### Supported Platforms and Distributions

[See docs/platforms.md for a list of supported platforms](docs/platforms.md).

Other platforms are discussed and tracked in this repository's Issue Tracker. Learn more about using the Issue Tracker under the section [Contributing, Reporting, Contacting](#contributing-reporting-contacting).

## Downloads

[**Download binaries from here**](//ungoogled-software.github.io/ungoogled-chromium-binaries/)

*NOTE: These binaries are provided by anyone who is willing to build and submit them. Because these binaries are not necessarily [reproducible](https://reproducible-builds.org/), authenticity cannot be guaranteed; in other words, there is always a non-zero probability that these binaries may have been tampered with. In the unlikely event that this has happened to you, please [report it in a new issue](#contributing-reporting-contacting).*

These binaries are known as **contributor binaries**.

Also, ungoogled-chromium is available in several **software repositories**:

* Android: Available via a custom [F-Droid](https://f-droid.org/) repo. [See instructions in ungoogled-chromium-android](https://github.com/wchen342/ungoogled-chromium-android#f-droid-repository)
* Arch: Available in [AUR](https://aur.archlinux.org/) as [`ungoogled-chromium`](https://aur.archlinux.org/packages/ungoogled-chromium/)
* Debian & Ubuntu: Available in OBS as [`ungoogled-chromium`](https://software.opensuse.org/download/package?package=ungoogled-chromium&project=home:ungoogled_chromium)
* Fedora: Available in [RPM Fusion](https://rpmfusion.org) as `chromium-browser-privacy`
* Gentoo: Available in [`::pf4public`](https://github.com/PF4Public/gentoo-overlay) overlay as [`ungoogled-chromium`](https://github.com/PF4Public/gentoo-overlay/tree/master/www-client/ungoogled-chromium) ebuild
* GNU Guix: Available as `ungoogled-chromium`
* macOS: Available in [Homebrew](https://brew.sh/) as [`eloston-chromium`](https://formulae.brew.sh/cask/eloston-chromium). Just run `brew cask fetch eloston-chromium` and `brew cask install eloston-chromium`. Chromium will appear in your `/Applications` directory.
* NixOS/nixpkgs: Available as `ungoogled-chromium`

## Source Code

This repository only contains the common code for all platforms; it does not contain all the configuration and scripts necessary to build ungoogled-chromium. Most users will want to use platform-specific repos, where all the remaining configuration and scripts are provided for specific platforms:

[**Find the repo for a specific platform here**](docs/platforms.md).

If you wish to include ungoogled-chromium code in your own build process, consider using [the tags in this repo](//github.com/Eloston/ungoogled-chromium/tags). These tags follow the format `{chromium_version}-{revision}` where

* `chromium_version` is the version of Chromium used in `x.x.x.x` format, and
* `revision` is a number indicating the version of ungoogled-chromium for the corresponding Chromium version.

Additionally, most platform-specific repos extend their tag scheme upon this one.
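Splitting such a tag back into its parts is straightforward, since the revision is the final `-`-separated component. A sketch with a hypothetical helper (not part of this repository):

```python
def parse_tag(tag):
    """Split a tag such as '83.0.4103.116-1' into its Chromium version
    and ungoogled-chromium revision, per the format described above."""
    chromium_version, sep, revision = tag.rpartition('-')
    if not sep:
        raise ValueError('tag has no revision suffix: ' + tag)
    return chromium_version, int(revision)
```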

**Building the source code**: [See docs/building.md](docs/building.md)

### Mirrors

List of mirrors:

* [Codeberg](https://codeberg.org): [main repo](https://codeberg.org/Eloston/ungoogled-chromium) and [ungoogled-software](https://codeberg.org/ungoogled-software)

## FAQ

[See the frequently-asked questions (FAQ) on the Wiki](//ungoogled-software.github.io/ungoogled-chromium-wiki/faq)

## Building Instructions

[See docs/building.md](docs/building.md)

## Design Documentation

[See docs/design.md](docs/design.md)

## Contributing, Reporting, Contacting

* For reporting and contacting, see [SUPPORT.md](SUPPORT.md)
* For contributing (e.g. how to help, submitting changes, criteria for new features), see [docs/contributing.md](docs/contributing.md)
* If you have some small contributions that don't fit our criteria, consider adding them to [ungoogled-software/contrib](https://github.com/ungoogled-software/contrib) or [our Wiki](https://github.com/ungoogled-software/ungoogled-chromium-wiki) instead.

## Credits

* [The Chromium Project](//www.chromium.org/)
* [Inox patchset](//github.com/gcarq/inox-patchset)
* [Debian](//tracker.debian.org/pkg/chromium-browser)
* [Bromite](//github.com/bromite/bromite)
* [Iridium Browser](//iridiumbrowser.de/)
* The users for testing and debugging, [contributing code](//github.com/Eloston/ungoogled-chromium/graphs/contributors), providing feedback, or simply using ungoogled-chromium in some capacity.

## Related Projects

List of known projects that fork or use changes from ungoogled-chromium:

* [Bromite](//github.com/bromite/bromite) (Borrows some patches. Features builds for Android)
* [ppc64le fork](//github.com/leo-lb/ungoogled-chromium) (Fork with changes to build for ppc64le CPUs)

## License

BSD-3-clause. See [LICENSE](LICENSE)


================================================
FILE: SUPPORT.md
================================================
# Support

**Before you submit feedback, please ensure you have tried the following**: 

* Read the [FAQ](//ungoogled-software.github.io/ungoogled-chromium-wiki/faq)
* Check if your feedback already exists in the [Issue Tracker](//github.com/Eloston/ungoogled-chromium/issues) (make sure to search closed issues and use search filters, as applicable)
	* [If your platform is officially supported (links here)](docs/platforms.md), make sure to visit the issue trackers for the platform's repository.
* If this is a problem, ensure it does *not* occur with regular Chromium or Google Chrome. If it does, then this is *not* a problem with ungoogled-chromium. Instead, please submit your feedback to the [Chromium bug tracker](//bugs.chromium.org/p/chromium/issues/list) or Google.
* Read the documentation under [docs/](docs/)

There are two channels for support:

* The [Issue Tracker](//github.com/Eloston/ungoogled-chromium/issues). The Issue Tracker is the main hub for discussions and development activity, and thus the primary means of support. It includes problems, suggestions, and questions.
	* [If your platform is officially supported (links here)](docs/platforms.md), there are also platform-specific issue trackers. You may post an issue there if you believe it is an issue specific to that platform.
* A chat room. There are two options available:
    * [Gitter](https://gitter.im/ungoogled-software/Lobby). It can use your GitHub account as an identity.
    * Matrix.org under the name `ungoogled-software/lobby`. It has a bidirectional connection with Gitter.


================================================
FILE: chromium_version.txt
================================================
83.0.4103.116


================================================
FILE: devutils/.coveragerc
================================================
[run]
branch = True
parallel = True
omit = tests/*

[report]
# Regexes for lines to exclude from consideration
exclude_lines =
    # Have to re-enable the standard pragma
    pragma: no cover

    # Don't complain about missing debug-only code:
    def __repr__
    if self\.debug

    # Don't complain if tests don't hit defensive assertion code:
    raise AssertionError
    raise NotImplementedError

    # Don't complain if non-runnable code isn't run:
    if 0:
    if __name__ == .__main__.:


================================================
FILE: devutils/README.md
================================================
# Developer utilities for ungoogled-chromium

This is a collection of scripts written for developing on ungoogled-chromium. See the descriptions at the top of each script for more information.


================================================
FILE: devutils/__init__.py
================================================


================================================
FILE: devutils/check_all_code.sh
================================================
#!/bin/bash

# Wrapper for devutils and utils formatter, linter, and tester

set -eu

_root_dir=$(dirname $(dirname $(readlink -f $0)))
cd ${_root_dir}/devutils

printf '###### utils yapf ######\n'
./run_utils_yapf.sh
printf '###### utils pylint ######\n'
./run_utils_pylint.py || ./run_utils_pylint.py --hide-fixme
printf '###### utils tests ######\n'
./run_utils_tests.sh

printf '###### devutils yapf ######\n'
./run_devutils_yapf.sh
printf '###### devutils pylint ######\n'
./run_devutils_pylint.py || ./run_devutils_pylint.py --hide-fixme
printf '###### devutils tests ######\n'
./run_devutils_tests.sh


================================================
FILE: devutils/check_downloads_ini.py
================================================
#!/usr/bin/env python3
# -*- coding: UTF-8 -*-

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Run sanity checking algorithms over downloads.ini files

It checks the following:

    * downloads.ini has the correct format (i.e. conforms to its schema)

Exit codes:
    * 0 if no problems detected
    * 1 if warnings or errors occur
"""

import argparse
import sys
from pathlib import Path

sys.path.insert(0, str(Path(__file__).resolve().parent.parent / 'utils'))
from downloads import DownloadInfo, schema
sys.path.pop(0)


def check_downloads_ini(downloads_ini_iter):
    """
    Combines the downloads.ini files provided and checks that they are valid.

    downloads_ini_iter must be an iterable of strings to downloads.ini files.

    Returns True if errors occurred, False otherwise.
    """
    try:
        DownloadInfo(downloads_ini_iter)
    except schema.SchemaError:
        return True
    return False


def main():
    """CLI entrypoint"""

    root_dir = Path(__file__).resolve().parent.parent
    default_downloads_ini = [str(root_dir / 'downloads.ini')]

    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        '-d',
        '--downloads-ini',
        type=Path,
        nargs='*',
        default=default_downloads_ini,
        help='List of downloads.ini files to check. Default: %(default)s')
    args = parser.parse_args()

    if check_downloads_ini(args.downloads_ini):
        exit(1)
    exit(0)


if __name__ == '__main__':
    main()


================================================
FILE: devutils/check_files_exist.py
================================================
#!/usr/bin/env python3

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""
Checks if files in a list exist.

Used for quick validation of lists in CI checks.
"""

import argparse
import sys
from pathlib import Path


def main():
    """CLI entrypoint"""
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument('root_dir', type=Path, help='The directory to check from')
    parser.add_argument('input_files', type=Path, nargs='+', help='The files lists to check')
    args = parser.parse_args()

    for input_name in args.input_files:
        file_iter = filter(
            len, map(str.strip,
                     Path(input_name).read_text(encoding='UTF-8').splitlines()))
        for file_name in file_iter:
            if not Path(args.root_dir, file_name).exists():
                print(
                    'ERROR: Path "{}" from file "{}" does not exist.'.format(file_name, input_name),
                    file=sys.stderr)
                exit(1)


if __name__ == "__main__":
    main()
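The `filter(len, map(str.strip, ...))` chain above silently skips blank lines before testing each entry for existence. A minimal standalone sketch of that logic, using a temporary directory instead of real list files:

```python
import tempfile
from pathlib import Path

def missing_paths(root_dir, list_text):
    """Yield entries of a newline-separated file list that do not exist
    under root_dir, skipping blank lines (same filter/map chain as above)."""
    for name in filter(len, map(str.strip, list_text.splitlines())):
        if not Path(root_dir, name).exists():
            yield name

with tempfile.TemporaryDirectory() as tmp:
    Path(tmp, 'present.txt').write_text('x')
    result = list(missing_paths(tmp, 'present.txt\n\nabsent.txt\n'))
```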


================================================
FILE: devutils/check_gn_flags.py
================================================
#!/usr/bin/env python3
# -*- coding: UTF-8 -*-

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Run sanity checking algorithms over GN flags

It checks the following:

    * GN flags in flags.gn are sorted and not duplicated

Exit codes:
    * 0 if no problems detected
    * 1 if warnings or errors occur
"""

import argparse
import sys
from pathlib import Path

sys.path.insert(0, str(Path(__file__).resolve().parent.parent / 'utils'))
from _common import ENCODING, get_logger
sys.path.pop(0)


def check_gn_flags(gn_flags_path):
    """
    Checks if GN flags are sorted and not duplicated.

    gn_flags_path is a pathlib.Path to the GN flags file to check

    Returns True if warnings were logged; False otherwise
    """
    keys_seen = set()
    warnings = False
    with gn_flags_path.open(encoding=ENCODING) as file_obj:
        iterator = iter(file_obj.read().splitlines())
    try:
        previous = next(iterator)
    except StopIteration:
        return warnings
    keys_seen.add(previous.split('=')[0])
    for current in iterator:
        gn_key = current.split('=')[0]
        if gn_key in keys_seen:
            get_logger().warning('In GN flags %s, "%s" appears at least twice', gn_flags_path,
                                 gn_key)
            warnings = True
        else:
            keys_seen.add(gn_key)
        if current < previous:
            get_logger().warning('In GN flags %s, "%s" should be sorted before "%s"', gn_flags_path,
                                 current, previous)
            warnings = True
        previous = current
    return warnings


def main():
    """CLI entrypoint"""

    root_dir = Path(__file__).resolve().parent.parent
    default_flags_gn = root_dir / 'flags.gn'

    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        '-f',
        '--flags-gn',
        type=Path,
        default=default_flags_gn,
        help='Path to the GN flags to use. Default: %(default)s')
    args = parser.parse_args()

    if check_gn_flags(args.flags_gn):
        exit(1)
    exit(0)


if __name__ == '__main__':
    main()
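`check_gn_flags` makes a single pass, tracking seen keys for duplicate detection and the previous line for the sort check. The same pairwise logic can be exercised on an in-memory list of `key=value` lines:

```python
def flag_warnings(lines):
    """Return (duplicate_keys, out_of_order_pairs) for 'key=value' lines,
    using the same single-pass comparison as check_gn_flags."""
    seen = set()
    dupes = []
    unsorted_pairs = []
    previous = None
    for current in lines:
        key = current.split('=')[0]
        if key in seen:
            dupes.append(key)
        seen.add(key)
        # Lines must be in ascending lexicographic order
        if previous is not None and current < previous:
            unsorted_pairs.append((current, previous))
        previous = current
    return dupes, unsorted_pairs
```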


================================================
FILE: devutils/check_patch_files.py
================================================
#!/usr/bin/env python3
# -*- coding: UTF-8 -*-

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Run sanity checking algorithms over ungoogled-chromium's patch files

It checks the following:

    * All patches exist
    * All patches are referenced by the patch order

Exit codes:
    * 0 if no problems detected
    * 1 if warnings or errors occur
"""

import argparse
import sys
from pathlib import Path

from third_party import unidiff

sys.path.insert(0, str(Path(__file__).resolve().parent.parent / 'utils'))
from _common import ENCODING, get_logger, parse_series
sys.path.pop(0)

# File suffixes to ignore for checking unused patches
_PATCHES_IGNORE_SUFFIXES = {'.md'}


def _read_series_file(patches_dir, series_file, join_dir=False):
    """
    Returns a generator over the entries in the series file

    patches_dir is a pathlib.Path to the directory of patches
    series_file is a pathlib.Path relative to patches_dir

    join_dir indicates if the patches_dir should be joined with the series entries
    """
    for entry in parse_series(patches_dir / series_file):
        if join_dir:
            yield patches_dir / entry
        else:
            yield entry


def check_patch_readability(patches_dir, series_path=Path('series')):
    """
    Checks that the patches listed in the series file are readable.
        Unreadable or missing patches are logged.

    patches_dir is a pathlib.Path to the directory of patches
    series_path is a pathlib.Path to the series file relative to patches_dir

    Returns True if warnings occurred, False otherwise.
    """
    warnings = False
    for patch_path in _read_series_file(patches_dir, series_path, join_dir=True):
        if patch_path.exists():
            with patch_path.open(encoding=ENCODING) as file_obj:
                try:
                    unidiff.PatchSet(file_obj.read())
                except unidiff.errors.UnidiffParseError:
                    get_logger().exception('Could not parse patch: %s', patch_path)
                    warnings = True
                    continue
        else:
            get_logger().warning('Patch not found: %s', patch_path)
            warnings = True
    return warnings


def check_unused_patches(patches_dir, series_path=Path('series')):
    """
    Checks if there are unused patches in patches_dir based on the series file series_path.
        Unused patches are logged.

    patches_dir is a pathlib.Path to the directory of patches
    series_path is a pathlib.Path to the series file relative to the patches_dir

    Returns True if there are unused patches; False otherwise.
    """
    unused_patches = set()
    for path in patches_dir.rglob('*'):
        if path.is_dir():
            continue
        if path.suffix in _PATCHES_IGNORE_SUFFIXES:
            continue
        unused_patches.add(str(path.relative_to(patches_dir)))
    unused_patches -= set(_read_series_file(patches_dir, series_path))
    unused_patches.remove(str(series_path))
    logger = get_logger()
    for entry in sorted(unused_patches):
        logger.warning('Unused patch: %s', entry)
    return bool(unused_patches)


def check_series_duplicates(patches_dir, series_path=Path('series')):
    """
    Checks if there are duplicate entries in the series file

    patches_dir is a pathlib.Path to the directory of patches
    series_path is a pathlib.Path to the series file relative to the patches_dir

    Returns True if there are duplicate entries; False otherwise.
    """
    entries_seen = set()
    for entry in _read_series_file(patches_dir, series_path):
        if entry in entries_seen:
            get_logger().warning('Patch appears more than once in series: %s', entry)
            return True
        entries_seen.add(entry)
    return False


def main():
    """CLI entrypoint"""

    root_dir = Path(__file__).resolve().parent.parent
    default_patches_dir = root_dir / 'patches'

    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        '-p',
        '--patches',
        type=Path,
        default=default_patches_dir,
        help='Path to the patches directory to use. Default: %(default)s')
    args = parser.parse_args()

    warnings = False
    warnings |= check_patch_readability(args.patches)
    warnings |= check_series_duplicates(args.patches)
    warnings |= check_unused_patches(args.patches)

    if warnings:
        exit(1)
    exit(0)


if __name__ == '__main__':
    main()
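`check_unused_patches` is a set difference at its core: everything found on disk, minus everything the series file references, minus the series file itself. Stripped of the filesystem walk, the computation looks like this (`discard` is used here instead of `remove` so a missing series entry cannot raise):

```python
def unused_entries(on_disk, series_entries, series_name='series'):
    """Return the sorted set difference behind check_unused_patches:
    files present on disk but neither listed in the series nor the
    series file itself."""
    unused = set(on_disk) - set(series_entries)
    unused.discard(series_name)
    return sorted(unused)
```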


================================================
FILE: devutils/print_tag_version.sh
================================================
#!/bin/bash

set -eu

_root_dir=$(dirname "$(dirname "$(readlink -f "$0")")")
printf '%s-%s' "$(cat "$_root_dir/chromium_version.txt")" "$(cat "$_root_dir/revision.txt")"


================================================
FILE: devutils/pytest.ini
================================================
[pytest]
testpaths = tests
#filterwarnings =
#	error
#	ignore::DeprecationWarning
#addopts = --cov-report term-missing --hypothesis-show-statistics -p no:warnings
# Live logging
#log_cli=true
#log_level=DEBUG
addopts = --capture=no --cov=. --cov-config=.coveragerc --cov-report term-missing -p no:warnings


================================================
FILE: devutils/run_devutils_pylint.py
================================================
#!/usr/bin/env python3

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Run Pylint over devutils"""

import argparse
import sys
from pathlib import Path

from run_other_pylint import ChangeDir, run_pylint


def main():
    """CLI entrypoint"""
    parser = argparse.ArgumentParser(description='Run Pylint over devutils')
    parser.add_argument('--hide-fixme', action='store_true', help='Hide "fixme" Pylint warnings.')
    parser.add_argument(
        '--show-locally-disabled',
        action='store_true',
        help='Show "locally-disabled" Pylint warnings.')
    args = parser.parse_args()

    disables = [
        'wrong-import-position',
        'bad-continuation',
    ]

    if args.hide_fixme:
        disables.append('fixme')
    if not args.show_locally_disabled:
        disables.append('locally-disabled')

    pylint_options = [
        '--disable={}'.format(','.join(disables)),
        '--jobs=4',
        '--score=n',
        '--persistent=n',
    ]

    ignore_prefixes = [
        ('third_party', ),
    ]

    sys.path.insert(1, str(Path(__file__).resolve().parent.parent / 'utils'))
    sys.path.insert(2, str(Path(__file__).resolve().parent.parent / 'devutils' / 'third_party'))
    with ChangeDir(Path(__file__).parent):
        result = run_pylint(
            Path(),
            pylint_options,
            ignore_prefixes=ignore_prefixes,
        )
    sys.path.pop(2)
    sys.path.pop(1)
    if not result:
        exit(1)
    exit(0)


if __name__ == '__main__':
    main()


================================================
FILE: devutils/run_devutils_tests.sh
================================================
#!/bin/bash

set -eu

_root_dir=$(dirname "$(dirname "$(readlink -f "$0")")")
cd "${_root_dir}/devutils"
python3 -m pytest -c "${_root_dir}/devutils/pytest.ini"


================================================
FILE: devutils/run_devutils_yapf.sh
================================================
#!/bin/bash

set -eu

_current_dir=$(dirname $(readlink -f $0))
_root_dir=$(dirname $_current_dir)
python3 -m yapf --style "$_root_dir/.style.yapf" -e '*/third_party/*' -rpi "$_current_dir"


================================================
FILE: devutils/run_other_pylint.py
================================================
#!/usr/bin/env python3

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Run Pylint over any module"""

import argparse
import os
import shutil
from pathlib import Path

from pylint import lint


class ChangeDir:
    """
    Changes directory to path in with statement
    """

    def __init__(self, path):
        self._path = path
        self._orig_path = os.getcwd()

    def __enter__(self):
        os.chdir(str(self._path))

    def __exit__(self, *_):
        os.chdir(self._orig_path)


def run_pylint(module_path, pylint_options, ignore_prefixes=tuple()):
    """Runs Pylint. Returns a boolean indicating success"""
    pylint_stats = Path('/run/user/{}/pylint_stats'.format(os.getuid()))
    if not pylint_stats.parent.is_dir(): #pylint: disable=no-member
        pylint_stats = Path('/run/shm/pylint_stats')
    os.environ['PYLINTHOME'] = str(pylint_stats)

    input_paths = list()
    if not module_path.exists():
        print('ERROR: Cannot find', module_path)
        exit(1)
    if module_path.is_dir():
        for path in module_path.rglob('*.py'):
            ignore_matched = False
            for prefix in ignore_prefixes:
                if path.parts[:len(prefix)] == prefix:
                    ignore_matched = True
                    break
            if ignore_matched:
                continue
            input_paths.append(str(path))
    else:
        input_paths.append(str(module_path))
    runner = lint.Run((*input_paths, *pylint_options), do_exit=False)

    if pylint_stats.is_dir():
        shutil.rmtree(str(pylint_stats))

    if runner.linter.msg_status != 0:
        print('WARNING: Non-zero exit status:', runner.linter.msg_status)
        return False
    return True


def main():
    """CLI entrypoint"""

    parser = argparse.ArgumentParser(description='Run Pylint over arbitrary module')
    parser.add_argument('--hide-fixme', action='store_true', help='Hide "fixme" Pylint warnings.')
    parser.add_argument(
        '--show-locally-disabled',
        action='store_true',
        help='Show "locally-disabled" Pylint warnings.')
    parser.add_argument('module_path', type=Path, help='Path to the module to check')
    args = parser.parse_args()

    if not args.module_path.exists():
        print('ERROR: Module path "{}" does not exist'.format(args.module_path))
        exit(1)

    disables = [
        'wrong-import-position',
        'bad-continuation',
    ]

    if args.hide_fixme:
        disables.append('fixme')
    if not args.show_locally_disabled:
        disables.append('locally-disabled')

    pylint_options = [
        '--disable={}'.format(','.join(disables)),
        '--jobs=4',
        '--score=n',
        '--persistent=n',
    ]

    if not run_pylint(args.module_path, pylint_options):
        exit(1)
    exit(0)


if __name__ == '__main__':
    main()
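The `ignore_prefixes` mechanism in `run_pylint` compares a path's leading `parts` against each prefix tuple, which is why prefixes are written as one-element tuples like `('third_party', )`. The match test in isolation:

```python
from pathlib import Path

def is_ignored(path, ignore_prefixes):
    """True if the path's leading components equal any prefix tuple,
    as in run_pylint's ignore handling."""
    return any(path.parts[:len(prefix)] == prefix for prefix in ignore_prefixes)

matches = [
    is_ignored(Path('third_party/unidiff/patch.py'), [('third_party', )]),
    is_ignored(Path('check_gn_flags.py'), [('third_party', )]),
]
```

Comparing tuples of components rather than string prefixes avoids false positives such as `third_party_extras/` matching a `third_party` prefix.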


================================================
FILE: devutils/run_other_yapf.sh
================================================
#!/bin/bash

set -eu

_root_dir=$(dirname "$(dirname "$(readlink -f "$0")")")
python3 -m yapf --style "$_root_dir/.style.yapf" -rpi "$@"


================================================
FILE: devutils/run_utils_pylint.py
================================================
#!/usr/bin/env python3

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Run Pylint over utils"""

import argparse
import sys
from pathlib import Path

from run_other_pylint import ChangeDir, run_pylint


def main():
    """CLI entrypoint"""
    parser = argparse.ArgumentParser(description='Run Pylint over utils')
    parser.add_argument('--hide-fixme', action='store_true', help='Hide "fixme" Pylint warnings.')
    parser.add_argument(
        '--show-locally-disabled',
        action='store_true',
        help='Show "locally-disabled" Pylint warnings.')
    args = parser.parse_args()

    disable = ['bad-continuation']

    if args.hide_fixme:
        disable.append('fixme')
    if not args.show_locally_disabled:
        disable.append('locally-disabled')

    pylint_options = [
        '--disable={}'.format(','.join(disable)),
        '--jobs=4',
        '--score=n',
        '--persistent=n',
    ]

    ignore_prefixes = [
        ('third_party', ),
        ('tests', ),
    ]

    sys.path.insert(1, str(Path(__file__).resolve().parent.parent / 'utils' / 'third_party'))
    with ChangeDir(Path(__file__).resolve().parent.parent / 'utils'):
        result = run_pylint(
            Path(),
            pylint_options,
            ignore_prefixes=ignore_prefixes,
        )
    sys.path.pop(1)
    if not result:
        exit(1)
    exit(0)


if __name__ == '__main__':
    main()


================================================
FILE: devutils/run_utils_tests.sh
================================================
#!/bin/bash

set -eu

_root_dir=$(dirname "$(dirname "$(readlink -f "$0")")")
cd "${_root_dir}/utils"
python3 -m pytest -c "${_root_dir}/utils/pytest.ini"


================================================
FILE: devutils/run_utils_yapf.sh
================================================
#!/bin/bash

set -eu

_root_dir=$(dirname $(dirname $(readlink -f $0)))
python3 -m yapf --style "$_root_dir/.style.yapf" -e '*/third_party/*' -rpi "$_root_dir/utils"


================================================
FILE: devutils/set_quilt_vars.sh
================================================
# Sets quilt variables for updating the patches
# Make sure to run this with the shell command "source" in order to inherit the variables into the interactive environment

# There is some problem with the absolute paths in QUILT_PATCHES and QUILT_SERIES breaking quilt
# (refresh and diff don't read QUILT_*_ARGS, and series displays absolute paths instead of relative)
# Specifying a quiltrc file fixes this, so "--quiltrc -" fixes this too.
# One side effect of '--quiltrc -' is that we lose default settings from /etc/quilt.quiltrc, so they are redefined below.
alias quilt='quilt --quiltrc -'

# Assume this script lives within the repository
REPO_ROOT=$(dirname "$(dirname "$(readlink -f "${BASH_SOURCE[0]:-${(%):-%x}}")")")

export QUILT_PATCHES="$REPO_ROOT/patches"
#export QUILT_SERIES=$(readlink -f "$REPO_ROOT/patches/series")

# Options below borrowed from Debian and default quilt options (from /etc/quilt.quiltrc on Debian)
export QUILT_PUSH_ARGS="--color=auto"
export QUILT_DIFF_OPTS="--show-c-function"
export QUILT_PATCH_OPTS="--unified --reject-format=unified"
export QUILT_DIFF_ARGS="-p ab --no-timestamps --no-index --color=auto --sort"
export QUILT_REFRESH_ARGS="-p ab --no-timestamps --no-index --sort --strip-trailing-whitespace"
export QUILT_COLORS="diff_hdr=1;32:diff_add=1;34:diff_rem=1;31:diff_hunk=1;33:diff_ctx=35:diff_cctx=33"
export QUILT_SERIES_ARGS="--color=auto"
export QUILT_PATCHES_ARGS="--color=auto"

# When non-default less options are used, add the -R option so that less outputs
# ANSI color escape codes "raw".
[ -n "$LESS" -a -z "${QUILT_PAGER+x}" ] && export QUILT_PAGER="less -FRX"


================================================
FILE: devutils/tests/__init__.py
================================================


================================================
FILE: devutils/tests/test_check_patch_files.py
================================================
# -*- coding: UTF-8 -*-

# Copyright (c) 2020 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Test check_patch_files.py"""

import tempfile
from pathlib import Path

from ..check_patch_files import check_series_duplicates


def test_check_series_duplicates():
    """Test check_series_duplicates"""
    with tempfile.TemporaryDirectory() as tmpdirname:
        patches_dir = Path(tmpdirname)
        series_path = Path(tmpdirname, 'series')

        # Check no duplicates
        series_path.write_text('\n'.join([
            'a.patch',
            'b.patch',
            'c.patch',
        ]))
        assert not check_series_duplicates(patches_dir)

        # Check duplicates
        series_path.write_text('\n'.join([
            'a.patch',
            'b.patch',
            'c.patch',
            'a.patch',
        ]))
        assert check_series_duplicates(patches_dir)


================================================
FILE: devutils/tests/test_validate_patches.py
================================================
# -*- coding: UTF-8 -*-

# Copyright (c) 2020 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Test validate_patches.py"""

import logging
import tempfile
import sys
from pathlib import Path

sys.path.insert(0, str(Path(__file__).resolve().parent.parent.parent / 'utils'))
from _common import LOGGER_NAME
sys.path.pop(0)

from .. import validate_patches


def test_test_patches(caplog):
    """Test _dry_check_patched_file"""

    #pylint: disable=protected-access
    caplog.set_level(logging.DEBUG, logger=LOGGER_NAME)
    #set_logging_level(logging.DEBUG)

    orig_file_content = """bye world"""
    series_iter = ['test.patch']

    def _run_test_patches(patch_content):
        with tempfile.TemporaryDirectory() as tmpdirname:
            Path(tmpdirname, 'foobar.txt').write_text(orig_file_content)
            Path(tmpdirname, 'test.patch').write_text(patch_content)
            _, patch_cache = validate_patches._load_all_patches(series_iter, Path(tmpdirname))
            required_files = validate_patches._get_required_files(patch_cache)
            files_under_test = validate_patches._retrieve_local_files(required_files,
                                                                      Path(tmpdirname))
            return validate_patches._test_patches(series_iter, patch_cache, files_under_test)

    # Check valid modification
    patch_content = """--- a/foobar.txt
+++ b/foobar.txt
@@ -1 +1 @@
-bye world
+hello world
"""
    assert not _run_test_patches(patch_content)

    # Check invalid modification
    patch_content = """--- a/foobar.txt
+++ b/foobar.txt
@@ -1 +1 @@
-hello world
+olleh world
"""
    assert _run_test_patches(patch_content)

    # Check correct removal
    patch_content = """--- a/foobar.txt
+++ /dev/null
@@ -1 +0,0 @@
-bye world
"""
    assert not _run_test_patches(patch_content)

    # Check incorrect removal
    patch_content = """--- a/foobar.txt
+++ /dev/null
@@ -1 +0,0 @@
-this line does not exist in foobar
"""
    assert _run_test_patches(patch_content)


================================================
FILE: devutils/third_party/README.md
================================================
This directory contains third-party libraries used by devutils.

Contents:

* [python-unidiff](//github.com/matiasb/python-unidiff)
    * For parsing and modifying unified diffs.


================================================
FILE: devutils/third_party/__init__.py
================================================


================================================
FILE: devutils/third_party/unidiff/__init__.py
================================================
# -*- coding: utf-8 -*-

# The MIT License (MIT)
# Copyright (c) 2014-2017 Matias Bordese
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
# OR OTHER DEALINGS IN THE SOFTWARE.


"""Unidiff parsing library."""

from __future__ import unicode_literals

from . import __version__
from .patch import (
    DEFAULT_ENCODING,
    LINE_TYPE_ADDED,
    LINE_TYPE_CONTEXT,
    LINE_TYPE_REMOVED,
    Hunk,
    PatchedFile,
    PatchSet,
    UnidiffParseError,
)

VERSION = __version__.__version__


================================================
FILE: devutils/third_party/unidiff/__version__.py
================================================
# -*- coding: utf-8 -*-

# The MIT License (MIT)
# Copyright (c) 2014-2017 Matias Bordese
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
# OR OTHER DEALINGS IN THE SOFTWARE.

__version__ = '0.5.5'


================================================
FILE: devutils/third_party/unidiff/constants.py
================================================
# -*- coding: utf-8 -*-

# The MIT License (MIT)
# Copyright (c) 2014-2017 Matias Bordese
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
# OR OTHER DEALINGS IN THE SOFTWARE.


"""Useful constants and regexes used by the package."""

from __future__ import unicode_literals

import re


RE_SOURCE_FILENAME = re.compile(
    r'^--- (?P<filename>[^\t\n]+)(?:\t(?P<timestamp>[^\n]+))?')
RE_TARGET_FILENAME = re.compile(
    r'^\+\+\+ (?P<filename>[^\t\n]+)(?:\t(?P<timestamp>[^\n]+))?')

# @@ (source offset, length) (target offset, length) @@ (section header)
RE_HUNK_HEADER = re.compile(
    r"^@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))?\ @@[ ]?(.*)")

#    kept line (context)
# \n empty line (treat like context)
# +  added line
# -  deleted line
# \  No newline case
RE_HUNK_BODY_LINE = re.compile(
    r'^(?P<line_type>[- \+\\])(?P<value>.*)', re.DOTALL)
RE_HUNK_EMPTY_BODY_LINE = re.compile(
    r'^(?P<line_type>[- \+\\]?)(?P<value>[\r\n]{1,2})', re.DOTALL)

RE_NO_NEWLINE_MARKER = re.compile(r'^\\ No newline at end of file')

DEFAULT_ENCODING = 'UTF-8'

LINE_TYPE_ADDED = '+'
LINE_TYPE_REMOVED = '-'
LINE_TYPE_CONTEXT = ' '
LINE_TYPE_EMPTY = ''
LINE_TYPE_NO_NEWLINE = '\\'
LINE_VALUE_NO_NEWLINE = ' No newline at end of file'
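The hunk-header pattern can be exercised directly to see which capture groups carry the offsets, lengths, and optional section header (the source/target lengths are `None` when the `,len` part is omitted, which is why `Hunk` defaults them to 1):

```python
import re

# Same pattern as RE_HUNK_HEADER above
RE_HUNK_HEADER = re.compile(
    r"^@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))?\ @@[ ]?(.*)")

match = RE_HUNK_HEADER.match('@@ -1,3 +1,4 @@ def main():')
groups = match.groups()
```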


================================================
FILE: devutils/third_party/unidiff/errors.py
================================================
# -*- coding: utf-8 -*-

# The MIT License (MIT)
# Copyright (c) 2014-2017 Matias Bordese
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
# OR OTHER DEALINGS IN THE SOFTWARE.


"""Errors and exceptions raised by the package."""

from __future__ import unicode_literals


class UnidiffParseError(Exception):
    """Exception when parsing the unified diff data."""


================================================
FILE: devutils/third_party/unidiff/patch.py
================================================
# -*- coding: utf-8 -*-

# The MIT License (MIT)
# Copyright (c) 2014-2017 Matias Bordese
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
# OR OTHER DEALINGS IN THE SOFTWARE.


"""Classes used by the unified diff parser to keep the diff data."""

from __future__ import unicode_literals

import codecs
import sys

from .constants import (
    DEFAULT_ENCODING,
    LINE_TYPE_ADDED,
    LINE_TYPE_CONTEXT,
    LINE_TYPE_EMPTY,
    LINE_TYPE_REMOVED,
    LINE_TYPE_NO_NEWLINE,
    LINE_VALUE_NO_NEWLINE,
    RE_HUNK_BODY_LINE,
    RE_HUNK_EMPTY_BODY_LINE,
    RE_HUNK_HEADER,
    RE_SOURCE_FILENAME,
    RE_TARGET_FILENAME,
    RE_NO_NEWLINE_MARKER,
)
from .errors import UnidiffParseError


PY2 = sys.version_info[0] == 2
if PY2:
    from StringIO import StringIO
    open_file = codecs.open
    make_str = lambda x: x.encode(DEFAULT_ENCODING)

    def implements_to_string(cls):
        cls.__unicode__ = cls.__str__
        cls.__str__ = lambda x: x.__unicode__().encode(DEFAULT_ENCODING)
        return cls
else:
    from io import StringIO
    open_file = open
    make_str = str
    implements_to_string = lambda x: x
    unicode = str
    basestring = str


@implements_to_string
class Line(object):
    """A diff line."""

    def __init__(self, value, line_type,
                 source_line_no=None, target_line_no=None, diff_line_no=None):
        super(Line, self).__init__()
        self.source_line_no = source_line_no
        self.target_line_no = target_line_no
        self.diff_line_no = diff_line_no
        self.line_type = line_type
        self.value = value

    def __repr__(self):
        return make_str("<Line: %s%s>") % (self.line_type, self.value)

    def __str__(self):
        return "%s%s" % (self.line_type, self.value)

    def __eq__(self, other):
        return (self.source_line_no == other.source_line_no and
                self.target_line_no == other.target_line_no and
                self.diff_line_no == other.diff_line_no and
                self.line_type == other.line_type and
                self.value == other.value)

    @property
    def is_added(self):
        return self.line_type == LINE_TYPE_ADDED

    @property
    def is_removed(self):
        return self.line_type == LINE_TYPE_REMOVED

    @property
    def is_context(self):
        return self.line_type == LINE_TYPE_CONTEXT


@implements_to_string
class PatchInfo(list):
    """Lines with extended patch info.

    Format of this info is not documented and it very much depends on
    patch producer.

    """

    def __repr__(self):
        value = "<PatchInfo: %s>" % self[0].strip()
        return make_str(value)

    def __str__(self):
        return ''.join(unicode(line) for line in self)


@implements_to_string
class Hunk(list):
    """Each of the modified blocks of a file."""

    def __init__(self, src_start=0, src_len=0, tgt_start=0, tgt_len=0,
                 section_header=''):
        if src_len is None:
            src_len = 1
        if tgt_len is None:
            tgt_len = 1
        self.added = 0  # number of added lines
        self.removed = 0  # number of removed lines
        self.source = []
        self.source_start = int(src_start)
        self.source_length = int(src_len)
        self.target = []
        self.target_start = int(tgt_start)
        self.target_length = int(tgt_len)
        self.section_header = section_header

    def __repr__(self):
        value = "<Hunk: @@ %d,%d %d,%d @@ %s>" % (self.source_start,
                                                  self.source_length,
                                                  self.target_start,
                                                  self.target_length,
                                                  self.section_header)
        return make_str(value)

    def __str__(self):
        # section header is optional and thus we output it only if it's present
        head = "@@ -%d,%d +%d,%d @@%s\n" % (
            self.source_start, self.source_length,
            self.target_start, self.target_length,
            ' ' + self.section_header if self.section_header else '')
        content = ''.join(unicode(line) for line in self)
        return head + content

    def append(self, line):
        """Append the line to hunk, and keep track of source/target lines."""
        super(Hunk, self).append(line)
        s = str(line)
        if line.is_added:
            self.added += 1
            self.target.append(s)
        elif line.is_removed:
            self.removed += 1
            self.source.append(s)
        elif line.is_context:
            self.target.append(s)
            self.source.append(s)

    def is_valid(self):
        """Check hunk header data matches entered lines info."""
        return (len(self.source) == self.source_length and
                len(self.target) == self.target_length)

    def source_lines(self):
        """Hunk lines from source file (generator)."""
        return (l for l in self if l.is_context or l.is_removed)

    def target_lines(self):
        """Hunk lines from target file (generator)."""
        return (l for l in self if l.is_context or l.is_added)
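# The counting logic in Hunk.append() and the length check in Hunk.is_valid()
# can be illustrated with a small standalone sketch (it does not use this
# module's classes; the sample hunk body below is made up for illustration):
#
# ```python
# # Classify hunk body lines by their one-character prefix, mirroring
# # Hunk.append(): '+' is an added line, '-' a removed line, and ' ' a
# # context line present in both the source and the target file.
# hunk_body = [" line one", "-line two", "+line 2", " line three"]
#
# added = sum(1 for line in hunk_body if line.startswith('+'))
# removed = sum(1 for line in hunk_body if line.startswith('-'))
# context = sum(1 for line in hunk_body if line.startswith(' '))
#
# # The hunk header lengths must satisfy these identities; this is
# # exactly what Hunk.is_valid() verifies against the parsed lines.
# source_length = context + removed
# target_length = context + added
# print(added, removed, source_length, target_length)  # 1 1 3 3
# ```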


class PatchedFile(list):
    """Patch updated file, it is a list of Hunks."""

    def __init__(self, patch_info=None, source='', target='',
                 source_timestamp=None, target_timestamp=None):
        super(PatchedFile, self).__init__()
        self.patch_info = patch_info
        self.source_file = source
        self.source_timestamp = source_timestamp
        self.target_file = target
        self.target_timestamp = target_timestamp

    def __repr__(self):
        return make_str("<PatchedFile: %s>") % make_str(self.path)

    def __str__(self):
        # patch info is optional
        info = '' if self.patch_info is None else str(self.patch_info)
        source = "--- %s%s\n" % (
            self.source_file,
            '\t' + self.source_timestamp if self.source_timestamp else '')
        target = "+++ %s%s\n" % (
            self.target_file,
            '\t' + self.target_timestamp if self.target_timestamp else '')
        hunks = ''.join(unicode(hunk) for hunk in self)
        return info + source + target + hunks

    def _parse_hunk(self, header, diff, encoding):
        """Parse hunk details."""
        header_info = RE_HUNK_HEADER.match(header)
        hunk_info = header_info.groups()
        hunk = Hunk(*hunk_info)

        source_line_no = hunk.source_start
        target_line_no = hunk.target_start
        expected_source_end = source_line_no + hunk.source_length
        expected_target_end = target_line_no + hunk.target_length

        for diff_line_no, line in diff:
            if encoding is not None:
                line = line.decode(encoding)

            valid_line = RE_HUNK_EMPTY_BODY_LINE.match(line)
            if not valid_line:
                valid_line = RE_HUNK_BODY_LINE.match(line)

            if not valid_line:
                raise UnidiffParseError('Hunk diff line expected: %s' % line)

            line_type = valid_line.group('line_type')
            if line_type == LINE_TYPE_EMPTY:
                line_type = LINE_TYPE_CONTEXT
            value = valid_line.group('value')
            original_line = Line(value, line_type=line_type)
            if line_type == LINE_TYPE_ADDED:
                original_line.target_line_no = target_line_no
                target_line_no += 1
            elif line_type == LINE_TYPE_REMOVED:
                original_line.source_line_no = source_line_no
                source_line_no += 1
            elif line_type == LINE_TYPE_CONTEXT:
                original_line.target_line_no = target_line_no
                target_line_no += 1
                original_line.source_line_no = source_line_no
                source_line_no += 1
            elif line_type == LINE_TYPE_NO_NEWLINE:
                pass
            else:
                original_line = None

            # stop parsing if we got past expected number of lines
            if (source_line_no > expected_source_end or
                    target_line_no > expected_target_end):
                raise UnidiffParseError('Hunk is longer than expected')

            if original_line:
                original_line.diff_line_no = diff_line_no
                hunk.append(original_line)

            # if hunk source/target lengths are ok, hunk is complete
            if (source_line_no == expected_source_end and
                    target_line_no == expected_target_end):
                break

        # report an error if we haven't got expected number of lines
        if (source_line_no < expected_source_end or
                target_line_no < expected_target_end):
            raise UnidiffParseError('Hunk is shorter than expected')

        self.append(hunk)
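# _parse_hunk() above relies on RE_HUNK_HEADER (defined in constants.py, not
# shown in this dump). A sketch of what that regex is assumed to look like,
# and why Hunk.__init__() maps a None length to 1:
#
# ```python
# import re
#
# # Assumed shape of the hunk header regex: it captures the four
# # start/length numbers plus an optional section header after the
# # second "@@". The ",length" parts are optional in unified diffs.
# HUNK_HEADER = re.compile(r'^@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@[ ]?(.*)')
#
# match = HUNK_HEADER.match('@@ -10,6 +10,8 @@ def main():')
# print(match.groups())  # ('10', '6', '10', '8', 'def main():')
#
# # With the length omitted ("@@ -1 +1 @@"), the length groups are None,
# # which Hunk.__init__() above normalizes to a length of 1.
# print(HUNK_HEADER.match('@@ -1 +1 @@').groups())  # ('1', None, '1', None, '')
# ```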

    def _add_no_newline_marker_to_last_hunk(self):
        if not self:
            raise UnidiffParseError(
                'Unexpected marker: ' + LINE_VALUE_NO_NEWLINE)
        last_hunk = self[-1]
        last_hunk.append(
            Line(LINE_VALUE_NO_NEWLINE + '\n', line_type=LINE_TYPE_NO_NEWLINE))

    def _append_trailing_empty_line(self):
        if not self:
            raise UnidiffParseError('Unexpected trailing newline character')
        last_hunk = self[-1]
        last_hunk.append(Line('\n', line_type=LINE_TYPE_EMPTY))

    @property
    def path(self):
        """Return the file path abstracted from VCS."""
        if (self.source_file.startswith('a/') and
                self.target_file.startswith('b/')):
            filepath = self.source_file[2:]
        elif (self.source_file.startswith('a/') and
              self.target_file == '/dev/null'):
            filepath = self.source_file[2:]
        elif (self.target_file.startswith('b/') and
              self.source_file == '/dev/null'):
            filepath = self.target_file[2:]
        else:
            filepath = self.source_file
        return filepath
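# The path property above can be mirrored by a hypothetical standalone helper
# (vcs_path is illustrative only, not part of this module): it strips the
# "a/"/"b/" prefixes that git-style diffs add, handling /dev/null for added
# and removed files.
#
# ```python
# def vcs_path(source_file, target_file):
#     """Strip VCS prefixes, mirroring PatchedFile.path above."""
#     if source_file.startswith('a/') and target_file.startswith('b/'):
#         return source_file[2:]
#     if source_file.startswith('a/') and target_file == '/dev/null':
#         return source_file[2:]
#     if target_file.startswith('b/') and source_file == '/dev/null':
#         return target_file[2:]
#     return source_file
#
# print(vcs_path('a/docs/readme.md', 'b/docs/readme.md'))  # docs/readme.md
# print(vcs_path('/dev/null', 'b/new_file.py'))            # new_file.py
# ```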

    @property
    def added(self):
        """Return the file total added lines."""
        return sum([hunk.added for hunk in self])

    @property
    def removed(self):
        """Return the file total removed lines."""
        return sum([hunk.removed for hunk in self])

    @property
    def is_added_file(self):
        """Return True if this patch adds the file."""
        return (len(self) == 1 and self[0].source_start == 0 and
                self[0].source_length == 0)

    @property
    def is_removed_file(self):
        """Return True if this patch removes the file."""
        return (len(self) == 1 and self[0].target_start == 0 and
                self[0].target_length == 0)

    @property
    def is_modified_file(self):
        """Return True if this patch modifies the file."""
        return not (self.is_added_file or self.is_removed_file)


@implements_to_string
class PatchSet(list):
    """A list of PatchedFiles."""

    def __init__(self, f, encoding=None):
        super(PatchSet, self).__init__()

        # convert string inputs to StringIO objects
        if isinstance(f, basestring):
            f = self._convert_string(f, encoding)

        # make sure we pass an iterator object to parse
        data = iter(f)
        # if encoding is None, assume we are reading unicode data
        self._parse(data, encoding=encoding)

    def __repr__(self):
        return make_str('<PatchSet: %s>') % super(PatchSet, self).__repr__()

    def __str__(self):
        return ''.join(unicode(patched_file) for patched_file in self)

    def _parse(self, diff, encoding):
        current_file = None
        patch_info = None

        diff = enumerate(diff, 1)
        for unused_diff_line_no, line in diff:
            if encoding is not None:
                line = line.decode(encoding)

            # check for source file header
            is_source_filename = RE_SOURCE_FILENAME.match(line)
            if is_source_filename:
                source_file = is_source_filename.group('filename')
                source_timestamp = is_source_filename.group('timestamp')
                # reset current file
                current_file = None
                continue

            # check for target file header
            is_target_filename = RE_TARGET_FILENAME.match(line)
            if is_target_filename:
                if current_file is not None:
                    raise UnidiffParseError('Target without source: %s' % line)
                target_file = is_target_filename.group('filename')
                target_timestamp = is_target_filename.group('timestamp')
                # add current file to PatchSet
                current_file = PatchedFile(
                    patch_info, source_file, target_file,
                    source_timestamp, target_timestamp)
                self.append(current_file)
                patch_info = None
                continue

            # check for hunk header
            is_hunk_header = RE_HUNK_HEADER.match(line)
            if is_hunk_header:
                if current_file is None:
                    raise UnidiffParseError('Unexpected hunk found: %s' % line)
                current_file._parse_hunk(line, diff, encoding)
                continue

            # check for no newline marker
            is_no_newline = RE_NO_NEWLINE_MARKER.match(line)
            if is_no_newline:
                if current_file is None:
                    raise UnidiffParseError('Unexpected marker: %s' % line)
                current_file._add_no_newline_marker_to_last_hunk()
                continue

            # sometimes hunks can be followed by empty lines
            if line == '\n' and current_file is not None:
                current_file._append_trailing_empty_line()
                continue

            # if nothing has matched above then this line is a patch info
            if patch_info is None:
                current_file = None
                patch_info = PatchInfo()
            patch_info.append(line)

    @classmethod
    def from_filename(cls, filename, encoding=DEFAULT_ENCODING, errors=None):
        """Return a PatchSet instance given a diff filename."""
        with open_file(filename, 'r', encoding=encoding, errors=errors) as f:
            instance = cls(f)
        return instance

    @staticmethod
    def _convert_string(data, encoding=None, errors='strict'):
        if encoding is not None:
            # if encoding is given, assume bytes and decode
            data = unicode(data, encoding=encoding, errors=errors)
        return StringIO(data)

    @classmethod
    def from_string(cls, data, encoding=None, errors='strict'):
        """Return a PatchSet instance given a diff string."""
        return cls(cls._convert_string(data, encoding, errors))

    @property
    def added_files(self):
        """Return patch added files as a list."""
        return [f for f in self if f.is_added_file]

    @property
    def removed_files(self):
        """Return patch removed files as a list."""
        return [f for f in self if f.is_removed_file]

    @property
    def modified_files(self):
        """Return patch modified files as a list."""
        return [f for f in self if f.is_modified_file]

    @property
    def added(self):
        """Return the patch total added lines."""
        return sum([f.added for f in self])

    @property
    def removed(self):
        """Return the patch total removed lines."""
        return sum([f.removed for f in self])


================================================
FILE: devutils/update_lists.py
================================================
#!/usr/bin/env python3

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""
Update binary pruning and domain substitution lists automatically.

A pre-existing source tree is required; this script does not download or unpack
one. No binary pruning or domain substitution will be applied to the source tree
after the process has finished.
"""

import argparse
import os
import sys

from pathlib import Path, PurePosixPath

sys.path.insert(0, str(Path(__file__).resolve().parent.parent / 'utils'))
from _common import get_logger
from domain_substitution import DomainRegexList, TREE_ENCODINGS
sys.path.pop(0)

# Encoding for output files
_ENCODING = 'UTF-8'

# NOTE: Include patterns have precedence over exclude patterns
# pathlib.Path.match() paths to include in binary pruning
PRUNING_INCLUDE_PATTERNS = [
    'components/domain_reliability/baked_in_configs/*',
    # Removals for patches/core/ungoogled-chromium/remove-unused-preferences-fields.patch
    'components/safe_browsing/core/common/safe_browsing_prefs.cc',
    'components/safe_browsing/core/common/safe_browsing_prefs.h',
    'components/signin/public/base/signin_pref_names.cc',
    'components/signin/public/base/signin_pref_names.h',
]

# pathlib.Path.match() paths to exclude from binary pruning
PRUNING_EXCLUDE_PATTERNS = [
    'chrome/common/win/eventlog_messages.mc', # TODO: False positive textfile
    # Exclude AFDO sample profile in binary format (Auto FDO)
    # Details: https://clang.llvm.org/docs/UsersManual.html#sample-profile-formats
    'chrome/android/profiles/afdo.prof',
    # TabRanker example preprocessor config
    # Details in chrome/browser/resource_coordinator/tab_ranker/README.md
    'chrome/browser/resource_coordinator/tab_ranker/example_preprocessor_config.pb',
    'chrome/browser/resource_coordinator/tab_ranker/pairwise_preprocessor_config.pb',
    # Exclusions for DOM distiller (contains model data only)
    'components/dom_distiller/core/data/distillable_page_model_new.bin',
    'components/dom_distiller/core/data/long_page_model.bin',
    # Exclusions for GeoLanguage data
    # Details: https://docs.google.com/document/d/18WqVHz5F9vaUiE32E8Ge6QHmku2QSJKvlqB9JjnIM-g/edit
    # Introduced with: https://chromium.googlesource.com/chromium/src/+/6647da61
    'components/language/content/browser/ulp_language_code_locator/geolanguage-data_rank0.bin',
    'components/language/content/browser/ulp_language_code_locator/geolanguage-data_rank1.bin',
    'components/language/content/browser/ulp_language_code_locator/geolanguage-data_rank2.bin',
    'third_party/icu/common/icudtl.dat', # Exclusion for ICU data
    # Exclusions for safe file extensions
    '*.ttf',
    '*.png',
    '*.jpg',
    '*.webp',
    '*.gif',
    '*.ico',
    '*.mp3',
    '*.wav',
    '*.flac',
    '*.icns',
    '*.woff',
    '*.woff2',
    '*makefile',
    '*.xcf',
    '*.cur',
    '*.pdf',
    '*.ai',
    '*.h',
    '*.c',
    '*.cpp',
    '*.cc',
    '*.mk',
    '*.bmp',
    '*.py',
    '*.xml',
    '*.html',
    '*.js',
    '*.json',
    '*.txt',
    '*.xtb'
]

# NOTE: Domain substitution path prefix exclusion has precedence over inclusion patterns
# Paths to exclude by prefixes of the POSIX representation for domain substitution
DOMAIN_EXCLUDE_PREFIXES = [
    'components/test/',
    'net/http/transport_security_state_static.json',
    # Exclusions for Visual Studio Project generation with GN (PR #445)
    'tools/gn/src/gn/visual_studio_writer.cc',
    # Exclusions for files covered with other patches/unnecessary
    'components/search_engines/prepopulated_engines.json',
    'third_party/blink/renderer/core/dom/document.cc',
]

# pathlib.Path.match() patterns to include in domain substitution
DOMAIN_INCLUDE_PATTERNS = [
    '*.h', '*.hh', '*.hpp', '*.hxx', '*.cc', '*.cpp', '*.cxx', '*.c', '*.json', '*.js',
    '*.html', '*.htm', '*.css', '*.py*', '*.grd', '*.sql', '*.idl', '*.mk', '*.gyp*', 'makefile',
    '*.txt', '*.xml', '*.mm', '*.jinja*', '*.gn', '*.gni'
]

# Binary-detection constant
_TEXTCHARS = bytearray({7, 8, 9, 10, 12, 13, 27} | set(range(0x20, 0x100)) - {0x7f})


class UnusedPatterns: #pylint: disable=too-few-public-methods
    """Tracks unused prefixes and patterns"""

    _all_names = ('pruning_include_patterns', 'pruning_exclude_patterns', 'domain_include_patterns',
                  'domain_exclude_prefixes')

    def __init__(self):
        # Initialize all tracked patterns and prefixes in sets
        # Users will discard elements that are used
        for name in self._all_names:
            setattr(self, name, set(globals()[name.upper()]))

    def log_unused(self):
        """
        Logs unused patterns and prefixes

        Returns True if there are unused patterns or prefixes; False otherwise
        """
        have_unused = False
        for name in self._all_names:
            current_set = getattr(self, name, None)
            if current_set:
                get_logger().error('Unused from %s: %s', name.upper(), current_set)
                have_unused = True
        return have_unused


def _is_binary(bytes_data):
    """
    Returns True if the data seems to be binary data (i.e. not human readable); False otherwise
    """
    # From: https://stackoverflow.com/a/7392391
    return bool(bytes_data.translate(None, _TEXTCHARS))
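# The translate(None, delete_table) idiom above deletes every byte listed in
# _TEXTCHARS; any bytes that survive are outside the whitespace/printable
# range, so the data is treated as binary. A quick standalone check:
#
# ```python
# textchars = bytearray({7, 8, 9, 10, 12, 13, 27} | set(range(0x20, 0x100)) - {0x7f})
#
# # All bytes of plain ASCII text are deleted, leaving b'' (falsy):
# print(bool(b'plain text\n'.translate(None, textchars)))         # False
# # A NUL byte survives deletion, so the data is flagged as binary:
# print(bool(b'\x00\x01binary blob'.translate(None, textchars)))  # True
# ```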


def _dir_empty(path):
    """
    Returns True if the directory is empty; False otherwise

    path is a pathlib.Path or string to a directory to test.
    """
    try:
        next(os.scandir(str(path)))
    except StopIteration:
        return True
    return False


def should_prune(path, relative_path, unused_patterns):
    """
    Returns True if a path should be pruned from the source tree; False otherwise

    path is the pathlib.Path to the file from the current working directory.
    relative_path is the pathlib.Path to the file from the source tree
    unused_patterns is a UnusedPatterns object
    """
    # Match against include patterns
    for pattern in PRUNING_INCLUDE_PATTERNS:
        if relative_path.match(pattern):
            unused_patterns.pruning_include_patterns.discard(pattern)
            return True

    # Match against exclude patterns
    for pattern in PRUNING_EXCLUDE_PATTERNS:
        if Path(str(relative_path).lower()).match(pattern):
            unused_patterns.pruning_exclude_patterns.discard(pattern)
            return False

    # Do binary data detection
    with path.open('rb') as file_obj:
        if _is_binary(file_obj.read()):
            return True

    # Passed all filtering; do not prune
    return False


def _check_regex_match(file_path, search_regex):
    """
    Returns True if a regex pattern matches a file; False otherwise

    file_path is a pathlib.Path to the file to test
    search_regex is a compiled regex object to search for domain names
    """
    with file_path.open("rb") as file_obj:
        file_bytes = file_obj.read()
        content = None
        for encoding in TREE_ENCODINGS:
            try:
                content = file_bytes.decode(encoding)
                break
            except UnicodeDecodeError:
                continue
        if search_regex.search(content) is not None:
            return True
    return False


def should_domain_substitute(path, relative_path, search_regex, unused_patterns):
    """
    Returns True if a path should be domain substituted in the source tree; False otherwise

    path is the pathlib.Path to the file from the current working directory.
    relative_path is the pathlib.Path to the file from the source tree.
    search_regex is a compiled regex object to search for domain names
    unused_patterns is a UnusedPatterns object
    """
    relative_path_posix = relative_path.as_posix().lower()
    for include_pattern in DOMAIN_INCLUDE_PATTERNS:
        if PurePosixPath(relative_path_posix).match(include_pattern):
            unused_patterns.domain_include_patterns.discard(include_pattern)
            for exclude_prefix in DOMAIN_EXCLUDE_PREFIXES:
                if relative_path_posix.startswith(exclude_prefix):
                    unused_patterns.domain_exclude_prefixes.discard(exclude_prefix)
                    return False
            return _check_regex_match(path, search_regex)
    return False
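# The ordering above matters: a file is considered for domain substitution
# only if an include pattern matches, and only then are exclusion prefixes
# consulted (exclusion wins). A standalone sketch with made-up pattern lists:
#
# ```python
# from pathlib import PurePosixPath
#
# INCLUDE_PATTERNS = ['*.cc', '*.json']
# EXCLUDE_PREFIXES = ['components/test/']
#
# def eligible(relative_posix):
#     """Mirror the include-then-exclude precedence used above."""
#     if not any(PurePosixPath(relative_posix).match(p) for p in INCLUDE_PATTERNS):
#         return False
#     return not any(relative_posix.startswith(p) for p in EXCLUDE_PREFIXES)
#
# print(eligible('net/http/http_util.cc'))          # True
# print(eligible('components/test/data/foo.json'))  # False (excluded prefix)
# print(eligible('docs/design.md'))                 # False (no include match)
# ```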


def compute_lists(source_tree, search_regex):
    """
    Compute the binary pruning and domain substitution lists of the source tree.
    Returns a tuple of three items in the following order:
    1. The sorted binary pruning list
    2. The sorted domain substitution list
    3. The UnusedPatterns object tracking unused patterns and prefixes

    source_tree is a pathlib.Path to the source tree
    search_regex is a compiled regex object to search for domain names
    """
    pruning_set = set()
    domain_substitution_set = set()
    deferred_symlinks = dict() # POSIX resolved path -> set of POSIX symlink paths
    source_tree = source_tree.resolve()
    unused_patterns = UnusedPatterns()

    for path in source_tree.rglob('*'):
        if not path.is_file():
            # NOTE: Path.rglob() does not traverse symlink dirs; no need for special handling
            continue
        relative_path = path.relative_to(source_tree)
        if path.is_symlink():
            try:
                resolved_relative_posix = path.resolve().relative_to(source_tree).as_posix()
            except ValueError:
                # Symlink leads out of the source tree
                continue
            if resolved_relative_posix in pruning_set:
                pruning_set.add(relative_path.as_posix())
            else:
                symlink_set = deferred_symlinks.get(resolved_relative_posix, None)
                if symlink_set is None:
                    symlink_set = set()
                    deferred_symlinks[resolved_relative_posix] = symlink_set
                symlink_set.add(relative_path.as_posix())
            # Path has finished processing because...
            # Pruning: either symlink has been added or removal determination has been deferred
            # Domain substitution: Only the real paths can be added, not symlinks
            continue
        try:
            if should_prune(path, relative_path, unused_patterns):
                relative_posix_path = relative_path.as_posix()
                pruning_set.add(relative_posix_path)
                symlink_set = deferred_symlinks.pop(relative_posix_path, tuple())
                if symlink_set:
                    pruning_set.update(symlink_set)
            elif should_domain_substitute(path, relative_path, search_regex, unused_patterns):
                domain_substitution_set.add(relative_path.as_posix())
        except: #pylint: disable=bare-except
            get_logger().exception('Unhandled exception while processing %s', relative_path)
            exit(1)
    return sorted(pruning_set), sorted(domain_substitution_set), unused_patterns


def main(args_list=None):
    """CLI entrypoint"""
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        '--pruning',
        metavar='PATH',
        type=Path,
        default='pruning.list',
        help='The path to store pruning.list. Default: %(default)s')
    parser.add_argument(
        '--domain-substitution',
        metavar='PATH',
        type=Path,
        default='domain_substitution.list',
        help='The path to store domain_substitution.list. Default: %(default)s')
    parser.add_argument(
        '--domain-regex',
        metavar='PATH',
        type=Path,
        default='domain_regex.list',
        help='The path to domain_regex.list. Default: %(default)s')
    parser.add_argument(
        '-t',
        '--tree',
        metavar='PATH',
        type=Path,
        required=True,
        help='The path to the source tree to use.')
    args = parser.parse_args(args_list)
    if args.tree.exists() and not _dir_empty(args.tree):
        get_logger().info('Using existing source tree at %s', args.tree)
    else:
        get_logger().error('No source tree found. Aborting.')
        exit(1)
    get_logger().info('Computing lists...')
    pruning_list, domain_substitution_list, unused_patterns = compute_lists(
        args.tree,
        DomainRegexList(args.domain_regex).search_regex)
    with args.pruning.open('w', encoding=_ENCODING) as file_obj:
        file_obj.writelines('%s\n' % line for line in pruning_list)
    with args.domain_substitution.open('w', encoding=_ENCODING) as file_obj:
        file_obj.writelines('%s\n' % line for line in domain_substitution_list)
    if unused_patterns.log_unused():
        get_logger().error('Please update or remove unused patterns and/or prefixes. '
                           'The lists have still been updated with the remaining valid entries.')
        exit(1)


if __name__ == "__main__":
    main()


================================================
FILE: devutils/update_platform_patches.py
================================================
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""
Utility to ease the updating of platform patches against ungoogled-chromium's patches
"""

import argparse
import os
import shutil
import sys
from pathlib import Path

sys.path.insert(0, str(Path(__file__).resolve().parent.parent / 'utils'))
from _common import ENCODING, get_logger
from patches import merge_patches
sys.path.pop(0)

_SERIES = 'series'
_SERIES_ORIG = 'series.orig'
_SERIES_PREPEND = 'series.prepend'
_SERIES_MERGED = 'series.merged'


def merge_platform_patches(platform_patches_dir, prepend_patches_dir):
    '''
    Prepends prepend_patches_dir into platform_patches_dir

    Returns True if successful, False otherwise
    '''
    if not (platform_patches_dir / _SERIES).exists():
        get_logger().error('Unable to find platform series file: %s',
                           platform_patches_dir / _SERIES)
        return False

    # Make series.orig file
    shutil.copyfile(str(platform_patches_dir / _SERIES), str(platform_patches_dir / _SERIES_ORIG))

    # Make series.prepend
    shutil.copyfile(str(prepend_patches_dir / _SERIES), str(platform_patches_dir / _SERIES_PREPEND))

    # Merge patches
    merge_patches([prepend_patches_dir], platform_patches_dir, prepend=True)
    (platform_patches_dir / _SERIES).replace(platform_patches_dir / _SERIES_MERGED)

    return True


def _dir_empty(path):
    '''
    Returns True if the directory exists and is empty; False otherwise
    '''
    try:
        next(os.scandir(str(path)))
    except StopIteration:
        return True
    except FileNotFoundError:
        pass
    return False


def _remove_files_with_dirs(root_dir, sorted_file_iter):
    '''
    Deletes a list of sorted files relative to root_dir, removing empty directories along the way
    '''
    past_parent = None
    for partial_path in sorted_file_iter:
        complete_path = Path(root_dir, partial_path)
        try:
            complete_path.unlink()
        except FileNotFoundError:
            get_logger().warning('Could not remove prepended patch: %s', complete_path)
        if past_parent != complete_path.parent:
            while past_parent and _dir_empty(past_parent):
                past_parent.rmdir()
                past_parent = past_parent.parent
            past_parent = complete_path.parent
    # Handle last path's directory
    while _dir_empty(complete_path.parent):
        complete_path.parent.rmdir()
        complete_path = complete_path.parent


def unmerge_platform_patches(platform_patches_dir):
    '''
    Undo merge_platform_patches(), adding any new patches from series.merged as necessary

    Returns True if successful, False otherwise
    '''
    if not (platform_patches_dir / _SERIES_PREPEND).exists():
        get_logger().error('Unable to find series.prepend at: %s',
                           platform_patches_dir / _SERIES_PREPEND)
        return False
    prepend_series = set(
        filter(len,
               (platform_patches_dir / _SERIES_PREPEND).read_text(encoding=ENCODING).splitlines()))

    # Remove prepended files with directories
    _remove_files_with_dirs(platform_patches_dir, sorted(prepend_series))

    # Determine positions of blank spaces in series.orig
    if not (platform_patches_dir / _SERIES_ORIG).exists():
        get_logger().error('Unable to find series.orig at: %s', platform_patches_dir / _SERIES_ORIG)
        return False
    orig_series = (platform_patches_dir / _SERIES_ORIG).read_text(encoding=ENCODING).splitlines()
    # patch path -> list of lines after patch path and before next patch path
    path_comments = dict()
    # patch path -> inline comment for patch
    path_inline_comments = dict()
    previous_path = None
    for partial_path in orig_series:
        if not partial_path or partial_path.startswith('#'):
            if previous_path not in path_comments:
                path_comments[previous_path] = list()
            path_comments[previous_path].append(partial_path)
        else:
            path_parts = partial_path.split(' #', maxsplit=1)
            previous_path = path_parts[0]
            if len(path_parts) == 2:
                path_inline_comments[path_parts[0]] = path_parts[1]
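    # The series-file bookkeeping above can be sketched standalone: comment
    # and blank lines are attached to the preceding patch path, and a trailing
    # " #..." on a patch line is kept as its inline comment (the sample series
    # entries below are made up; setdefault replaces the explicit dict checks):
    #
    # ```python
    # orig_series = [
    #     'core/first.patch # needs rebase',
    #     '# group of extra fixes',
    #     '',
    #     'extra/second.patch',
    # ]
    # path_comments = {}
    # path_inline_comments = {}
    # previous_path = None
    # for entry in orig_series:
    #     if not entry or entry.startswith('#'):
    #         path_comments.setdefault(previous_path, []).append(entry)
    #     else:
    #         path, _, inline = entry.partition(' #')
    #         previous_path = path
    #         if inline:
    #             path_inline_comments[path] = inline
    #
    # print(path_inline_comments)  # {'core/first.patch': ' needs rebase'}
    # print(path_comments)         # {'core/first.patch': ['# group of extra fixes', '']}
    # ```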

    # Apply changes on series.merged into a modified version of series.orig
    if not (platform_patches_dir / _SERIES_MERGED).exists():
        get_logger().error('Unable to find series.merged at: %s',
                           platform_patches_dir / _SERIES_MERGED)
        return False
    new_series = filter(
        len, (platform_patches_dir / _SERIES_MERGED).read_text(encoding=ENCODING).splitlines())
    new_series = filter((lambda x: x not in prepend_series), new_series)
    new_series = list(new_series)
    series_index = 0
    while series_index < len(new_series):
        current_path = new_series[series_index]
        if current_path in path_inline_comments:
            new_series[series_index] = current_path + ' #' + path_inline_comments[current_path]
        if current_path in path_comments:
            new_series.insert(series_index + 1, '\n'.join(path_comments[current_path]))
            series_index += 1
        series_index += 1

    # Write series file
    with (platform_patches_dir / _SERIES).open('w', encoding=ENCODING) as series_file:
        series_file.write('\n'.join(new_series))
        series_file.write('\n')

    # All other operations are successful; remove merging intermediates
    (platform_patches_dir / _SERIES_MERGED).unlink()
    (platform_patches_dir / _SERIES_ORIG).unlink()
    (platform_patches_dir / _SERIES_PREPEND).unlink()

    return True
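The comment re-attachment pass above (parse series.orig, then splice standalone and inline comments back into the merged series) can be sketched as a standalone helper; the series contents below are hypothetical examples, not real patch names:

```python
def reattach_comments(orig_series, merged_series):
    """Re-attach standalone and inline comments from orig_series onto merged_series"""
    path_comments = {} # patch path -> comment/blank lines that followed it
    path_inline_comments = {} # patch path -> inline comment text
    previous_path = None
    for line in orig_series:
        if not line or line.startswith('#'):
            path_comments.setdefault(previous_path, []).append(line)
        else:
            path, *inline = line.split(' #', maxsplit=1)
            previous_path = path
            if inline:
                path_inline_comments[path] = inline[0]
    new_series = list(merged_series)
    index = 0
    while index < len(new_series):
        path = new_series[index]
        if path in path_inline_comments:
            new_series[index] = path + ' #' + path_inline_comments[path]
        if path in path_comments:
            # Skip past the inserted comment block on the next iteration
            new_series.insert(index + 1, '\n'.join(path_comments[path]))
            index += 1
        index += 1
    return new_series
```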


def main():
    """CLI Entrypoint"""
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        'command',
        choices=('merge', 'unmerge'),
        help='Merge or unmerge ungoogled-chromium patches with platform patches')
    parser.add_argument(
        'platform_patches',
        type=Path,
        help='The path to the platform patches in GNU Quilt format to merge into')
    args = parser.parse_args()

    repo_dir = Path(__file__).resolve().parent.parent

    success = False
    if args.command == 'merge':
        success = merge_platform_patches(args.platform_patches, repo_dir / 'patches')
    elif args.command == 'unmerge':
        success = unmerge_platform_patches(args.platform_patches)
    else:
        raise NotImplementedError(args.command)

    if success:
        return 0
    return 1


if __name__ == '__main__':
    exit(main())


================================================
FILE: devutils/validate_config.py
================================================
#!/usr/bin/env python3
# -*- coding: UTF-8 -*-

# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Run sanity checking algorithms over ungoogled-chromium's config files

NOTE: This script is hardcoded to run over ungoogled-chromium's config files only.
To check other files, use the other scripts imported by this script.

It checks the following:

    * All patches exist
    * All patches are referenced by the patch order
    * Each patch is used only once
    * GN flags in flags.gn are sorted and not duplicated
    * downloads.ini has the correct format (i.e. conforms to its schema)

Exit codes:
    * 0 if no problems detected
    * 1 if warnings or errors occur
"""

import sys
from pathlib import Path

from check_downloads_ini import check_downloads_ini
from check_gn_flags import check_gn_flags
from check_patch_files import (check_patch_readability, check_series_duplicates,
                               check_unused_patches)


def main():
    """CLI entrypoint"""

    warnings = False
    root_dir = Path(__file__).resolve().parent.parent
    patches_dir = root_dir / 'patches'

    # Check patches
    warnings |= check_patch_readability(patches_dir)
    warnings |= check_series_duplicates(patches_dir)
    warnings |= check_unused_patches(patches_dir)

    # Check GN flags
    warnings |= check_gn_flags(root_dir / 'flags.gn')

    # Check downloads.ini
    warnings |= check_downloads_ini([root_dir / 'downloads.ini'])

    if warnings:
        exit(1)
    exit(0)


if __name__ == '__main__':
    if sys.argv[1:]:
        print(__doc__)
    else:
        main()


================================================
FILE: devutils/validate_patches.py
================================================
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

# Copyright (c) 2020 The ungoogled-chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""
Validates that all patches apply cleanly against the source tree.

The required source tree files can be retrieved from Google directly.
"""

import argparse
import ast
import base64
import email.utils
import json
import logging
import sys
import tempfile
from pathlib import Path

sys.path.insert(0, str(Path(__file__).resolve().parent / 'third_party'))
import unidiff
from unidiff.constants import LINE_TYPE_EMPTY, LINE_TYPE_NO_NEWLINE
sys.path.pop(0)

sys.path.insert(0, str(Path(__file__).resolve().parent.parent / 'utils'))
from domain_substitution import TREE_ENCODINGS
from _common import ENCODING, get_logger, get_chromium_version, parse_series, add_common_params
from patches import dry_run_check
sys.path.pop(0)

try:
    import requests
    import requests.adapters
    import urllib3.util

    class _VerboseRetry(urllib3.util.Retry):
        """A more verbose version of HTTP Adatper about retries"""

        def sleep_for_retry(self, response=None):
            """Sleeps for Retry-After, and logs the sleep time"""
            if response:
                retry_after = self.get_retry_after(response)
                if retry_after:
                    get_logger().info(
                        'Got HTTP status %s with Retry-After header. Retrying after %s seconds...',
                        response.status, retry_after)
                else:
                    get_logger().info(
                        'Could not find Retry-After header for HTTP response %s. Status reason: %s',
                        response.status, response.reason)
            return super().sleep_for_retry(response)

        def _sleep_backoff(self):
            """Log info about backoff sleep"""
            get_logger().info('Running HTTP request sleep backoff')
            super()._sleep_backoff()

    def _get_requests_session():
        session = requests.Session()
        http_adapter = requests.adapters.HTTPAdapter(
            max_retries=_VerboseRetry(
                total=10,
                read=10,
                connect=10,
                backoff_factor=8,
                status_forcelist=urllib3.Retry.RETRY_AFTER_STATUS_CODES,
                raise_on_status=False))
        session.mount('http://', http_adapter)
        session.mount('https://', http_adapter)
        return session
except ImportError:

    def _get_requests_session():
        raise RuntimeError('The Python module "requests" is required for remote '
                           'file downloading. It can be installed from PyPI.')


_ROOT_DIR = Path(__file__).resolve().parent.parent
_SRC_PATH = Path('src')


class _PatchValidationError(Exception):
    """Raised when patch validation fails"""


class _UnexpectedSyntaxError(RuntimeError):
    """Raised when unexpected syntax is used in DEPS"""


class _NotInRepoError(RuntimeError):
    """Raised when the remote file is not present in the given repo"""


class _DepsNodeVisitor(ast.NodeVisitor):
    _valid_syntax_types = (ast.mod, ast.expr_context, ast.boolop, ast.Assign, ast.Add, ast.Name,
                           ast.Dict, ast.Str, ast.NameConstant, ast.List, ast.BinOp)
    _allowed_callables = ('Var', )

    def visit_Call(self, node): #pylint: disable=invalid-name
        """Override Call syntax handling"""
        if node.func.id not in self._allowed_callables:
            raise _UnexpectedSyntaxError('Unexpected call of "%s" at line %s, column %s' %
                                         (node.func.id, node.lineno, node.col_offset))

    def generic_visit(self, node):
        for ast_type in self._valid_syntax_types:
            if isinstance(node, ast_type):
                super().generic_visit(node)
                return
        raise _UnexpectedSyntaxError('Unexpected {} at line {}, column {}'.format(
            type(node).__name__, node.lineno, node.col_offset))


def _validate_deps(deps_text):
    """Returns True if the DEPS file passes validation; False otherwise"""
    try:
        _DepsNodeVisitor().visit(ast.parse(deps_text))
    except _UnexpectedSyntaxError as exc:
        get_logger().error('%s', exc)
        return False
    return True


def _deps_var(deps_globals):
    """Return a function that implements DEPS's Var() function"""

    def _var_impl(var_name):
        """Implementation of Var() in DEPS"""
        return deps_globals['vars'][var_name]

    return _var_impl


def _parse_deps(deps_text):
    """Returns a dict of parsed DEPS data"""
    deps_globals = {'__builtins__': None}
    deps_globals['Var'] = _deps_var(deps_globals)
    exec(deps_text, deps_globals) #pylint: disable=exec-used
    return deps_globals
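As a sanity check, the `exec`-based parsing above can be exercised on a toy DEPS snippet (the repository name and revision below are made up for illustration):

```python
# Hypothetical DEPS content; real DEPS files are much larger.
TOY_DEPS = """
vars = {'chromium_git': 'https://chromium.googlesource.com'}
deps = {'src/third_party/foo': Var('chromium_git') + '/foo.git@deadbeef'}
"""

def parse_deps(deps_text):
    """Same approach as _parse_deps: exec with builtins disabled, plus Var()"""
    deps_globals = {'__builtins__': None}
    deps_globals['Var'] = lambda name: deps_globals['vars'][name]
    exec(deps_text, deps_globals) # Trusted input only; see _validate_deps above
    return deps_globals

parsed = parse_deps(TOY_DEPS)
```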


def _download_googlesource_file(download_session, repo_url, version, relative_path):
    """
    Returns the contents of the text file with path within the given
    googlesource.com repo as a string.
    """
    if 'googlesource.com' not in repo_url:
        raise ValueError('Repository URL is not a googlesource.com URL: {}'.format(repo_url))
    full_url = repo_url + '/+/{}/{}?format=TEXT'.format(version, str(relative_path))
    get_logger().debug('Downloading: %s', full_url)
    response = download_session.get(full_url)
    if response.status_code == 404:
        raise _NotInRepoError()
    response.raise_for_status()
    # Assume all files that need patching are compatible with UTF-8
    return base64.b64decode(response.text, validate=True).decode('UTF-8')


def _get_dep_value_url(deps_globals, dep_value):
    """Helper for _process_deps_entries"""
    if isinstance(dep_value, str):
        url = dep_value
    elif isinstance(dep_value, dict):
        if 'url' not in dep_value:
            # Ignore other types like CIPD since
            # it probably isn't necessary
            return None
        url = dep_value['url']
    else:
        raise NotImplementedError()
    if '{' in url:
        # Probably a Python format string
        url = url.format(**deps_globals['vars'])
    if url.count('@') != 1:
        raise _PatchValidationError('Invalid number of @ symbols in URL: {}'.format(url))
    return url


def _process_deps_entries(deps_globals, child_deps_tree, child_path, deps_use_relative_paths):
    """Helper for _get_child_deps_tree"""
    for dep_path_str, dep_value in deps_globals.get('deps', dict()).items():
        url = _get_dep_value_url(deps_globals, dep_value)
        if url is None:
            continue
        dep_path = Path(dep_path_str)
        if not deps_use_relative_paths:
            try:
                dep_path = Path(dep_path_str).relative_to(child_path)
            except ValueError:
                # Not applicable to the current DEPS tree path
                continue
        grandchild_deps_tree = None # Delaying creation of dict() until it's needed
        for recursedeps_item in deps_globals.get('recursedeps', tuple()):
            if isinstance(recursedeps_item, str):
                if recursedeps_item == str(dep_path):
                    grandchild_deps_tree = 'DEPS'
            else: # Some sort of iterable
                recursedeps_item_path, recursedeps_item_depsfile = recursedeps_item
                if recursedeps_item_path == str(dep_path):
                    grandchild_deps_tree = recursedeps_item_depsfile
        if grandchild_deps_tree is None:
            # This dep is not recursive; i.e. it is fully loaded
            grandchild_deps_tree = dict()
        child_deps_tree[dep_path] = (*url.split('@'), grandchild_deps_tree)


def _get_child_deps_tree(download_session, current_deps_tree, child_path, deps_use_relative_paths):
    """Helper for _download_source_file"""
    repo_url, version, child_deps_tree = current_deps_tree[child_path]
    if isinstance(child_deps_tree, str):
        # Load unloaded DEPS
        deps_globals = _parse_deps(
            _download_googlesource_file(download_session, repo_url, version, child_deps_tree))
        child_deps_tree = dict()
        current_deps_tree[child_path] = (repo_url, version, child_deps_tree)
        deps_use_relative_paths = deps_globals.get('use_relative_paths', False)
        _process_deps_entries(deps_globals, child_deps_tree, child_path, deps_use_relative_paths)
    return child_deps_tree, deps_use_relative_paths


def _get_last_chromium_modification():
    """Returns the last modification date of the chromium-browser-official tar file"""
    with _get_requests_session() as session:
        response = session.head(
            'https://storage.googleapis.com/chromium-browser-official/chromium-{}.tar.xz'.format(
                get_chromium_version()))
        response.raise_for_status()
        return email.utils.parsedate_to_datetime(response.headers['Last-Modified'])


def _get_gitiles_git_log_date(log_entry):
    """Helper for _get_gitiles_git_log_date"""
    return email.utils.parsedate_to_datetime(log_entry['committer']['time'])


def _get_gitiles_commit_before_date(repo_url, target_branch, target_datetime):
    """Returns the hexadecimal hash of the closest commit before target_datetime"""
    json_log_url = '{repo}/+log/{branch}?format=JSON'.format(repo=repo_url, branch=target_branch)
    with _get_requests_session() as session:
        response = session.get(json_log_url)
        response.raise_for_status()
        git_log = json.loads(response.text[5:]) # Strip the gitiles anti-XSSI prefix ")]}'\n"
    assert len(git_log) == 2 # 'log' and 'next' entries
    assert 'log' in git_log
    assert git_log['log']
    git_log = git_log['log']
    # Check boundary conditions
    if _get_gitiles_git_log_date(git_log[0]) < target_datetime:
        # Newest commit is older than target datetime
        return git_log[0]['commit']
    if _get_gitiles_git_log_date(git_log[-1]) > target_datetime:
        # Oldest commit is newer than the target datetime; assume oldest is close enough.
        get_logger().warning(
            'Oldest entry in gitiles log for repo "%s" is newer than target; '
            'continuing with oldest entry...', repo_url)
        return git_log[-1]['commit']
    # Do binary search
    low_index = 0
    high_index = len(git_log) - 1
    while low_index != high_index:
        mid_index = low_index + (high_index - low_index) // 2
        if _get_gitiles_git_log_date(git_log[mid_index]) > target_datetime:
            low_index = mid_index + 1
        else:
            high_index = mid_index
    return git_log[low_index]['commit']
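A minimal standalone sketch of this newest-first bisection, with made-up commit hashes and dates; it finds the newest log entry whose date is at or before the target:

```python
import datetime

def closest_commit_before(log, target):
    """log is a list of (commit_hash, datetime) pairs sorted newest-first"""
    if log[0][1] < target:
        return log[0][0] # Even the newest commit predates the target
    if log[-1][1] > target:
        return log[-1][0] # Every commit is newer; settle for the oldest
    low, high = 0, len(log) - 1
    while low != high:
        mid = low + (high - low) // 2
        if log[mid][1] > target:
            low = mid + 1
        else:
            high = mid
    return log[low][0] # First (newest) entry at or before the target
```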


class _FallbackRepoManager:
    """Retrieves fallback repos and caches data needed for determining repos"""

    _GN_REPO_URL = 'https://gn.googlesource.com/gn.git'

    def __init__(self):
        self._cache_gn_version = None

    @property
    def gn_version(self):
        """
        Returns the version of the GN repo for the Chromium version used by this code
        """
        if not self._cache_gn_version:
            # Because there seems to be no reference to the logic for generating the
            # chromium-browser-official tar file, it's possible that it is being generated
            # by an internal script that manually injects the GN repository files.
            # Therefore, assume that the GN version used in the chromium-browser-official tar
            # files correspond to the latest commit in the master branch of the GN repository
            # at the time of the tar file's generation. We can get an approximation for the
            # generation time by using the last modification date of the tar file on
            # Google's file server.
            self._cache_gn_version = _get_gitiles_commit_before_date(
                self._GN_REPO_URL, 'master', _get_last_chromium_modification())
        return self._cache_gn_version

    def get_fallback(self, current_relative_path, current_node, root_deps_tree):
        """
        Helper for _download_source_file

        It returns a new (repo_url, version, new_relative_path) to attempt a file download with
        """
        assert len(current_node) == 3
        # GN special processing
        try:
            new_relative_path = current_relative_path.relative_to('tools/gn')
        except ValueError:
            pass
        else:
            if current_node is root_deps_tree[_SRC_PATH]:
                get_logger().info('Redirecting to GN repo version %s for path: %s', self.gn_version,
                                  current_relative_path)
                return (self._GN_REPO_URL, self.gn_version, new_relative_path)
        return None, None, None


def _get_target_file_deps_node(download_session, root_deps_tree, target_file):
    """
    Helper for _download_source_file

    Returns the corresponding repo containing target_file based on the DEPS tree
    """
    # The "deps" from the current DEPS file
    current_deps_tree = root_deps_tree
    current_node = None
    # Path relative to the current node (i.e. DEPS file)
    current_relative_path = Path('src', target_file)
    previous_relative_path = None
    deps_use_relative_paths = False
    child_path = None
    while current_relative_path != previous_relative_path:
        previous_relative_path = current_relative_path
        for child_path in current_deps_tree:
            try:
                current_relative_path = previous_relative_path.relative_to(child_path)
            except ValueError:
                # previous_relative_path does not start with child_path
                continue
            current_node = current_deps_tree[child_path]
            # current_node will match with current_deps_tree after the following statement
            current_deps_tree, deps_use_relative_paths = _get_child_deps_tree(
                download_session, current_deps_tree, child_path, deps_use_relative_paths)
            break
    assert current_node is not None
    return current_node, current_relative_path


def _download_source_file(download_session, root_deps_tree, fallback_repo_manager, target_file):
    """
    Downloads the source tree file from googlesource.com

    download_session is an active requests.Session() object
    root_deps_tree is the DEPS tree from _initialize_deps_tree()
    """
    current_node, current_relative_path = _get_target_file_deps_node(download_session,
                                                                     root_deps_tree, target_file)
    # Attempt download with potential fallback logic
    repo_url, version, _ = current_node
    try:
        # Download with DEPS-provided repo
        return _download_googlesource_file(download_session, repo_url, version,
                                           current_relative_path)
    except _NotInRepoError:
        pass
    get_logger().debug(
        'Path "%s" (relative: "%s") not found using DEPS tree; finding fallback repo...',
        target_file, current_relative_path)
    repo_url, version, current_relative_path = fallback_repo_manager.get_fallback(
        current_relative_path, current_node, root_deps_tree)
    if not repo_url:
        get_logger().error('No fallback repo found for "%s" (relative: "%s")', target_file,
                           current_relative_path)
        raise _NotInRepoError()
    try:
        # Download with fallback repo
        return _download_googlesource_file(download_session, repo_url, version,
                                           current_relative_path)
    except _NotInRepoError:
        pass
    get_logger().error('File "%s" (relative: "%s") not found in fallback repo "%s", version "%s"',
                       target_file, current_relative_path, repo_url, version)
    raise _NotInRepoError()


def _initialize_deps_tree():
    """
    Initializes and returns a dependency tree for DEPS files

    The DEPS tree is a dict with the following format:
    key - pathlib.Path relative to the DEPS file's path
    value - tuple(repo_url, version, recursive dict here)
        repo_url is the URL to the dependency's repository root
        If the recursive dict is instead a string, it is the name of the DEPS
            file to load on demand
    """
    root_deps_tree = {
        _SRC_PATH: ('https://chromium.googlesource.com/chromium/src.git', get_chromium_version(),
                    'DEPS')
    }
    return root_deps_tree
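Concretely, after one level of DEPS loading the tree might look like the following (the dependency path, repository URL, and versions are illustrative, not real pins):

```python
from pathlib import Path

example_deps_tree = {
    Path('src'): (
        'https://chromium.googlesource.com/chromium/src.git',
        '100.0.0.0', # Placeholder version string
        {
            # A loaded "deps" entry whose own DEPS file has not been fetched
            # yet, so the third element is a DEPS file name instead of a dict
            Path('third_party/foo'): ('https://chromium.googlesource.com/foo.git',
                                      'deadbeef', 'DEPS'),
        },
    ),
}
repo_url, version, children = example_deps_tree[Path('src')]
```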


def _retrieve_remote_files(file_iter):
    """
    Retrieves all file paths in file_iter from Google

    file_iter is an iterable of strings that are relative UNIX paths to
        files in the Chromium source.

    Returns a dict of relative UNIX path strings to a list of lines in the file as strings
    """

    files = dict()

    root_deps_tree = _initialize_deps_tree()

    try:
        total_files = len(file_iter)
    except TypeError:
        total_files = None

    logger = get_logger()
    if total_files is None:
        logger.info('Downloading remote files...')
    else:
        logger.info('Downloading %d remote files...', total_files)
    last_progress = 0
    file_count = 0
    fallback_repo_manager = _FallbackRepoManager()
    with _get_requests_session() as download_session:
        download_session.stream = False # To ensure connection to Google can be reused
        for file_path in file_iter:
            file_count += 1
            if total_files:
                current_progress = file_count * 100 // total_files // 5 * 5
                if current_progress != last_progress:
                    last_progress = current_progress
                    logger.info('%d%% downloaded', current_progress)
            else:
                current_progress = file_count // 20 * 20
                if current_progress != last_progress:
                    last_progress = current_progress
                    logger.info('%d files downloaded', current_progress)
            try:
                files[file_path] = _download_source_file(
                    download_session, root_deps_tree, fallback_repo_manager, file_path).split('\n')
            except _NotInRepoError:
                get_logger().warning('Could not find "%s" remotely. Skipping...', file_path)
    return files


def _retrieve_local_files(file_iter, source_dir):
    """
    Retrieves all file paths in file_iter from the local source tree

    file_iter is an iterable of strings that are relative UNIX paths to
        files in the Chromium source.

    Returns a dict of relative UNIX path strings to a list of lines in the file as strings
    """
    files = dict()
    for file_path in file_iter:
        try:
            raw_content = (source_dir / file_path).read_bytes()
        except FileNotFoundError:
            get_logger().warning('Missing file from patches: %s', file_path)
            continue
        for encoding in TREE_ENCODINGS:
            try:
                content = raw_content.decode(encoding)
                break
            except UnicodeDecodeError:
                continue
        else:
            raise ValueError('Unable to decode with any encoding: %s' % file_path)
        files[file_path] = content.split('\n')
    if not files:
        get_logger().error('All files used by patches are missing!')
    return files
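The decode-with-fallback loop above can be reduced to a standalone helper; the encoding list here is a stand-in for TREE_ENCODINGS:

```python
FALLBACK_ENCODINGS = ('UTF-8', 'ISO-8859-1') # Stand-in for TREE_ENCODINGS

def decode_with_fallback(raw_content: bytes) -> str:
    """Try each encoding in order; raise only if every one fails"""
    for encoding in FALLBACK_ENCODINGS:
        try:
            return raw_content.decode(encoding)
        except UnicodeDecodeError:
            continue
    raise ValueError('Unable to decode with any encoding')
```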


def _modify_file_lines(patched_file, file_lines):
    """Helper for _apply_file_unidiff"""
    # Cursor for keeping track of the current line during hunk application
    # NOTE: The cursor is based on the line list index, not the line number!
    line_cursor = None
    for hunk in patched_file:
        # Validate hunk will match
        if not hunk.is_valid():
            raise _PatchValidationError('Hunk is not valid: {}'.format(repr(hunk)))
        line_cursor = hunk.target_start - 1
        for line in hunk:
            normalized_line = line.value.rstrip('\n')
            if line.is_added:
                file_lines[line_cursor:line_cursor] = (normalized_line, )
                line_cursor += 1
            elif line.is_removed:
                if normalized_line != file_lines[line_cursor]:
                    raise _PatchValidationError(
                        "Line '{}' does not match removal line '{}' from patch".format(
                            file_lines[line_cursor], normalized_line))
                del file_lines[line_cursor]
            elif line.is_context:
                if not normalized_line and line_cursor == len(file_lines):
                    # We reached the end of the file
                    break
                if normalized_line != file_lines[line_cursor]:
                    raise _PatchValidationError(
                        "Line '{}' does not match context line '{}' from patch".format(
                            file_lines[line_cursor], normalized_line))
                line_cursor += 1
            else:
                assert line.line_type in (LINE_TYPE_EMPTY, LINE_TYPE_NO_NEWLINE)


def _apply_file_unidiff(patched_file, files_under_test):
    """Applies the unidiff.PatchedFile to the source files under testing"""
    patched_file_path = Path(patched_file.path)
    if patched_file.is_added_file:
        if patched_file_path in files_under_test:
            assert files_under_test[patched_file_path] is None
        assert len(patched_file) == 1 # Should be only one hunk
        assert patched_file[0].removed == 0
        assert patched_file[0].target_start == 1
        files_under_test[patched_file_path] = [x.value for x in patched_file[0]]
    elif patched_file.is_removed_file:
        # Remove lines to see if file to be removed matches patch
        _modify_file_lines(patched_file, files_under_test[patched_file_path])
        files_under_test[patched_file_path] = None
    else: # Patching an existing file
        assert patched_file.is_modified_file
        _modify_file_lines(patched_file, files_under_test[patched_file_path])


def _dry_check_patched_file(patched_file, orig_file_content):
    """Run "patch --dry-check" on a unidiff.PatchedFile for diagnostics"""
    with tempfile.TemporaryDirectory() as tmpdirname:
        tmp_dir = Path(tmpdirname)
        # Write file to patch
        patched_file_path = tmp_dir / patched_file.path
        patched_file_path.parent.mkdir(parents=True, exist_ok=True)
        patched_file_path.write_text(orig_file_content)
        # Write patch
        patch_path = tmp_dir / 'broken_file.patch'
        patch_path.write_text(str(patched_file))
        # Dry run
        _, dry_stdout, _ = dry_run_check(patch_path, tmp_dir)
        return dry_stdout


def _test_patches(series_iter, patch_cache, files_under_test):
    """
    Tests the patches specified in the iterable series_iter

    Returns a boolean indicating if any of the patches have failed
    """
    for patch_path_str in series_iter:
        for patched_file in patch_cache[patch_path_str]:
            orig_file_content = None
            if get_logger().isEnabledFor(logging.DEBUG):
                orig_file_content = files_under_test.get(Path(patched_file.path))
                if orig_file_content:
                    orig_file_content = ' '.join(orig_file_content)
            try:
                _apply_file_unidiff(patched_file, files_under_test)
            except _PatchValidationError as exc:
                get_logger().warning('Patch failed validation: %s', patch_path_str)
                get_logger().debug('Specifically, file "%s" failed validation: %s',
                                   patched_file.path, exc)
                if get_logger().isEnabledFor(logging.DEBUG):
                    # _PatchValidationError cannot be thrown when a file is added
                    assert patched_file.is_modified_file or patched_file.is_removed_file
                    assert orig_file_content is not None
                    get_logger().debug(
                        'Output of "patch --dry-run" for this patch on this file:\n%s',
                        _dry_check_patched_file(patched_file, orig_file_content))
                return True
            except: #pylint: disable=bare-except
                get_logger().warning('Patch failed validation: %s', patch_path_str)
                get_logger().debug(
                    'Specifically, file "%s" caused exception while applying:',
                    patched_file.path,
                    exc_info=True)
                return True
    return False


def _load_all_patches(series_iter, patches_dir):
    """
    Returns a tuple of the following:
    - boolean indicating success or failure of reading files
    - dict of relative UNIX path strings to unidiff.PatchSet
    """
    had_failure = False
    unidiff_dict = dict()
    for relative_path in series_iter:
        if relative_path in unidiff_dict:
            continue
        unidiff_dict[relative_path] = unidiff.PatchSet.from_filename(
            str(patches_dir / relative_path), encoding=ENCODING)
        if not (patches_dir / relative_path).read_text(encoding=ENCODING).endswith('\n'):
            had_failure = True
            get_logger().warning('Patch file does not end with newline: %s',
                                 str(patches_dir / relative_path))
    return had_failure, unidiff_dict


def _get_required_files(patch_cache):
    """Returns an iterable of pathlib.Path files needed from the source tree for patching"""
    new_files = set() # Files introduced by patches
    file_set = set()
    for patch_set in patch_cache.values():
        for patched_file in patch_set:
            if patched_file.is_added_file:
                new_files.add(patched_file.path)
            elif patched_file.path not in new_files:
                file_set.add(Path(patched_file.path))
    return file_set


def _get_files_under_test(args, required_files, parser):
    """
    Helper for main to get files_under_test

    Exits the program if --cache-remote debugging option is used
    """
    if args.local:
        files_under_test = _retrieve_local_files(required_files, args.local)
    else: # --remote and --cache-remote
        files_under_test = _retrieve_remote_files(required_files)
        if args.cache_remote:
            for file_path, file_content in files_under_test.items():
                if not (args.cache_remote / file_path).parent.exists():
                    (args.cache_remote / file_path).parent.mkdir(parents=True)
                with (args.cache_remote / file_path).open('w', encoding=ENCODING) as cache_file:
                    cache_file.write('\n'.join(file_content))
            parser.exit()
    return files_under_test


def main():
    """CLI Entrypoint"""
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        '-s',
        '--series',
        type=Path,
        metavar='FILE',
        default=str(Path('patches', 'series')),
        help='The series file listing patches to apply. Default: %(default)s')
    parser.add_argument(
        '-p',
        '--patches',
        type=Path,
        metavar='DIRECTORY',
        default='patches',
        help='The patches directory to read from. Default: %(default)s')
    add_common_params(parser)

    file_source_group = parser.add_mutually_exclusive_group(required=True)
    file_source_group.add_argument(
        '-l',
        '--local',
        type=Path,
        metavar='DIRECTORY',
        help=
        'Use a local source tree. It must be UNMODIFIED, otherwise the results will not be valid.')
    file_source_group.add_argument(
        '-r',
        '--remote',
        action='store_true',
        help=('Download the required source tree files from Google. '
              'This feature requires the Python module "requests". If you do not want to '
              'install this, consider using --local instead.'))
    file_source_group.add_argument(
        '-c',
        '--cache-remote',
        type=Path,
        metavar='DIRECTORY',
        help='(For debugging) Store the required remote files in an empty local directory')
    args = parser.parse_args()
    if args.cache_remote and not args.cache_remote.exists():
        if args.cache_remote.parent.exists():
            args.cache_remote.mkdir()
        else:
            parser.error('Parent of cache path {} does not exist'.format(args.cache_remote))

    if not args.series.is_file():
        parser.error('--series path is not a file or not found: {}'.format(args.series))
    if not args.patches.is_dir():
        parser.error('--patches path is not a directory or not found: {}'.format(args.patches))

    series_iterable = tuple(parse_series(args.series))
    had_failure, patch_cache = _load_all_patches(series_iterable, args.patches)
    required_files = _get_required_files(patch_cache)
    files_under_test = _get_files_under_test(args, required_files, parser)
    had_failure |= _test_patches(series_iterable, patch_cache, files_under_test)
    if had_failure:
        get_logger().error('***FAILED VALIDATION; SEE ABOVE***')
        if not args.verbose:
            get_logger().info('(For more error details, re-run with the "-v" flag)')
        parser.exit(status=1)
    else:
        get_logger().info('Passed validation (%d patches total)', len(series_iterable))


if __name__ == '__main__':
    main()


================================================
FILE: docs/building.md
================================================
# Building ungoogled-chromium

The recommended way to build ungoogled-chromium is by consulting [the repository for your supported platform (links here)](platforms.md).

* *Linux users*: If your distribution is not listed, you will need to use Portable Linux.

If you want to add ungoogled-chromium to your existing Chromium build process, see the next section. Additionally, you may reference the repositories for supported platforms for inspiration.

## Integrating ungoogled-chromium into your Chromium build process

Typically, ungoogled-chromium is built from [code in platform-specific repositories](platforms.md). However, ungoogled-chromium can also be included in part or in whole into any custom Chromium build. In this section, **we will assume you already have a process to make your own Chromium builds**.

Before continuing, you may find it helpful to have a look through [the design documentation](design.md).

The following procedure outlines the essential steps to build Chromium with all of ungoogled-chromium's features. **These steps are not sufficient to build ungoogled-chromium on their own**.

1. Ensure Chromium is downloaded, such as via `depot_tools`. On most of our supported platforms, we instead use a custom tool to do this:

```sh
mkdir -p build/download_cache
./utils/downloads.py retrieve -c build/download_cache -i downloads.ini
./utils/downloads.py unpack -c build/download_cache -i downloads.ini -- build/src
```

2. Prune binaries

```sh
./utils/prune_binaries.py build/src pruning.list
```

3. Apply patches

```sh
./utils/patches.py apply build/src patches
```

4. Substitute domains

```sh
./utils/domain_substitution.py apply -r domain_regex.list -f domain_substitution.list -c build/domsubcache.tar.gz build/src
```

5. Build GN. If you are using `depot_tools` to check out Chromium, or you already have a GN binary, you should skip this step.

```sh
mkdir -p build/src/out/Default
cd build/src
./tools/gn/bootstrap/bootstrap.py --skip-generate-buildfiles -j4 -o out/Default/
```

6. Invoke the build:

```
mkdir -p build/src/out/Default
# NOTE: flags.gn contains only a subset of what is needed to run the build.
cp flags.gn build/src/out/Default/args.gn
cd build/src
# If you have additional GN flags to add, make sure to add them now.
./out/Default/gn gen out/Default --fail-on-unused-args
ninja -C out/Default chrome chromedriver chrome_sandbox
```

## Building FAQ

### My build keeps crashing because I run out of RAM! How can I fix it?

Here are several ways to address this, in decreasing order of preference:

1. Set the GN flag `jumbo_file_merge_limit` to a lower value. At the time of writing, Debian uses `8` (the default varies, but it can be as high as `50`).
2. Decrease the number of parallel threads to Ninja (the `-j` flag)
3. Add swap space


================================================
FILE: docs/contributing.md
================================================
# Contributing

This document contains our criteria and guidelines for contributing to ungoogled-chromium.

If you have **small contributions that don't fit our criteria**, consider adding them to [ungoogled-software/contrib](https://github.com/ungoogled-software/contrib) or [our Wiki](https://github.com/ungoogled-software/ungoogled-chromium-wiki) instead.

If you are a developer of an **officially-supported platform**, be sure to check out the [Platform Repository Standards and Guidelines](repo_management.md).

List of contents:

* [How to help](#how-to-help)
* [Submitting changes](#submitting-changes)
* [Criteria for new features](#criteria-for-new-features)

### How to help

Generally, ungoogled-chromium needs contributors to help:

* Keep up to date with the latest stable Chromium, and identify any problematic changes in the new version that need modification.
* Help with issues marked with the `help wanted` tag (usually either questions for other users, or requests for help from other developers)
* Review Pull Requests from other contributors
* Implement feature requests ("enhancements" in the Issue Tracker), large or small.
* Implement closed issues marked with the `backlog` tag.
	* If it requires new code, please read through the [Submitting changes](#submitting-changes) section below.

In addition, anyone is free to help others in need of support in the Issue Tracker.

If there are fixes, tweaks, or additions you want to make, continue onto the following section.

### Submitting changes

Please submit all changes via Pull Requests.

Guidelines:

* You are welcome to submit minor changes, such as bug fixes, documentation fixes, and tweaks.
* If your change has an associated issue, please let others know that you are working on it.
* If you want to submit a new feature, please read through the [Criteria for new features](#criteria-for-new-features) below.
* When in doubt about the acceptance of a change, you are welcome to ask via an issue first.

### Criteria for new features

1. New features should not detract from the default Chromium experience, unless they fall under the project's main objectives (i.e. removing Google integration and enhancing privacy).

    * For larger features, please propose them via an issue first.

2. New features should live behind a setting that is **off by default**.

    * Settings are usually added via a command-line flag and `chrome://flags` entries. See [the relevant section in docs/developing.md](developing.md#adding-command-line-flags-and-chromeflags-options) for more information.
    * Unless there are significant benefits, adding the setting to `chrome://settings` is *not recommended* due to the additional maintenance required (caused by the infrastructure that backs preferences).

**NOTE**: In the event that the codebase changes significantly for a non-essential patch (i.e. a patch that does not contribute to the main objectives of ungoogled-chromium), it will be removed until someone updates it.


================================================
FILE: docs/design.md
================================================
# Design

This document contains a high-level technical description of ungoogled-chromium and its components.

## Overview

ungoogled-chromium consists of the following major components:

* [Configuration](#configuration)
    * [Configuration files](#configuration-files)
    * [Source file processors](#source-file-processors)
    * [Patches](#patches)
* [Packaging](#packaging)

The following sections describe each component.

## Configuration

Configuration is a broad term that refers to patches, build flags, and metadata about Chromium source code. It consists of the following components:

* [Configuration files](#configuration-files)
* [Source file processors](#source-file-processors)
* [Patches](#patches)

The following sections describe each component in more depth.

### Configuration Files

Configuration files (or config files) are files that store build configuration and source code changes for a build.

**IMPORTANT**: For consistency, all config files must be encoded in UTF-8.

List of configuration files:

* `chromium_version.txt` - The Chromium version used by ungoogled-chromium
* `revision.txt` - The revision of the changes on top of the given Chromium version.
* `pruning.list` - [See the Source File Processors section](#source-file-processors)
* `domain_regex.list` - [See the Source File Processors section](#source-file-processors)
* `domain_substitution.list` - [See the Source File Processors section](#source-file-processors)
* `downloads.ini` - Archives to download and unpack into the buildspace tree. This includes code not bundled in the Chromium source code archive that is specific to a non-Linux platform. On platforms such as macOS, this also includes a pre-built LLVM toolchain for convenience (which can be removed and built from source if desired).
* `flags.gn` - GN arguments to set before building.

### Source File Processors

Source file processors are utilities that directly manipulate the Chromium source tree before building. Currently, there are two such utilities: binary pruning, and domain substitution.

**Binary Pruning**: Strips binaries from the source code. This includes pre-built executables, shared libraries, and other forms of machine code. Most are substituted with system or user-provided equivalents, or are built from source; those binaries that cannot be removed do not contain machine code.

The list of files to remove is determined by the config file `pruning.list`. This config file is generated by `devutils/update_lists.py`.

**Domain Substitution**: Replaces Google and several other web domain names in the Chromium source code with non-existent alternatives ending in `qjz9zk`. These changes are mainly used as a backup measure to detect potentially unpatched requests to Google. Note that domain substitution is a crude process, and *may not be easily undone*.

With a few patches from ungoogled-chromium, any requests with these domain names sent via `net::URLRequest` in the Chromium code are blocked, and the user is notified via an info bar.

Similar to binary pruning, the list of files to modify is stored in `domain_substitution.list`; it is also updated with `devutils/update_lists.py`.

The regular expressions to use are listed in `domain_regex.list`; the search and replacement expressions are delimited with a pound (`#`) symbol. The restrictions for the entries are as follows:
* All replacement expressions must end in the TLD `qjz9zk`.
* The search and replacement expressions must have a one-to-one correspondence: no two search expressions can match the same string, and no two replacement expressions can result in the same string.
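As an illustration of the format, the following sketch parses the `#`-delimited entries and applies them to a string. This is a minimal example of the mechanism only, not the project's actual substitution logic (which lives in `utils/domain_substitution.py` and handles many more details):

```python
import re

def parse_regex_list(text):
    """Parse '#'-delimited search#replacement pairs, one per line."""
    pairs = []
    for line in text.splitlines():
        if line.strip():
            search, replacement = line.split('#', 1)
            pairs.append((re.compile(search), replacement))
    return pairs

def substitute(pairs, content):
    """Apply each search/replacement pair to the string."""
    for pattern, replacement in pairs:
        content = pattern.sub(replacement, content)
    return content

# One entry taken from domain_regex.list:
pairs = parse_regex_list(r'google([A-Za-z\-]*?\\*?)\.com(?!mon)#9oo91e\g<1>.qjz9zk')
print(substitute(pairs, 'https://googleapis.com/'))  # https://9oo91eapis.qjz9zk/
```

Note how the `(?!mon)` lookahead in this entry prevents `google.commons`-style strings from being rewritten.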

### Patches

All of ungoogled-chromium's patches for the Chromium source code are located in `patches/`. This directory conforms to the default GNU Quilt format. That is:

* All patches must reside inside `patches/`
* There is a `patches/series` text file that defines the order to apply all the patches. These patches are listed as a relative path from the `patches` directory.
    * Lines starting with the pound symbol (`#`) are ignored
    * For lines with patch paths: If there is a space followed by a pound symbol, the text after the patch path will be ignored.
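The comment rules above can be sketched in a few lines of Python. This is an illustrative parser only, not the project's actual implementation (the real one lives in the `devutils` scripts):

```python
def parse_series(series_text):
    """Yield patch paths from series-file text, applying the
    comment rules described above (illustrative sketch only)."""
    for line in series_text.splitlines():
        line = line.strip()
        # Lines starting with '#' (and blank lines) are ignored
        if not line or line.startswith('#'):
            continue
        # A space followed by '#' starts a trailing comment
        line = line.split(' #', 1)[0].rstrip()
        if line:
            yield line

sample = '# core patches\ncore/bromite/fix.patch\nextra/tweak.patch # optional\n'
print(list(parse_series(sample)))  # ['core/bromite/fix.patch', 'extra/tweak.patch']
```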

All patch files in ungoogled-chromium must satisfy these formatting requirements:

* Patch filenames must end with the extension `.patch`
* The content must be in [unified format](https://en.wikipedia.org/wiki/Diff_utility#Unified_format).
* All paths in the hunk headers must begin after the first slash (which corresponds to the argument `-p1` for GNU patch).
* All patches must apply cleanly (i.e. no fuzz).
* It is recommended that hunk paths have the `a/` and `b/` prefixes, and a context of 3 (like the git default).
* All patches must be encoded in UTF-8 (i.e. same encoding as config files).
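A few of these requirements can be checked mechanically. The sketch below is a simplified, hypothetical checker for illustration only; the authoritative validation is performed by the scripts in `devutils/` (e.g. `validate_patches.py`):

```python
def check_patch_file(path_str, raw_bytes):
    """Return a list of violations of the formatting rules above
    (simplified illustration; real checks live in devutils/)."""
    problems = []
    if not path_str.endswith('.patch'):
        problems.append('filename must end with .patch')
    try:
        text = raw_bytes.decode('utf-8')
    except UnicodeDecodeError:
        return problems + ['not valid UTF-8']
    # Hunk file paths should carry the a/ and b/ prefixes (-p1 style)
    for line in text.splitlines():
        if line.startswith('--- ') and not line[4:].startswith(('a/', '/dev/null')):
            problems.append('old-file path missing a/ prefix: ' + line)
        elif line.startswith('+++ ') and not line[4:].startswith(('b/', '/dev/null')):
            problems.append('new-file path missing b/ prefix: ' + line)
    return problems

good = b'--- a/foo.cc\n+++ b/foo.cc\n@@ -1 +1 @@\n-old\n+new\n'
print(check_patch_file('fix.patch', good))  # []
```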

Patches are categorized into two directories directly under `patches/`:

1. **core**: Changes regarding background requests, code specific to Google web services, or code using pre-made binaries. They must be kept up-to-date with all of the changes in Chromium.
2. **extra**: Changes to features regarding control and transparency. They are not guaranteed to persist across updates to Chromium.

Within each category, patches are grouped by the following:

* `debian/` - Patches from Debian's Chromium
    * Patches are not modified unless they conflict with Inox's patches
    * These patches are not Debian-specific. For those, see the `debian/patches` directory
* `inox-patchset/` - Contains a modified subset of patches from Inox patchset.
    * Some patches such as those that change branding are omitted
    * Patches are not modified unless they do not apply cleanly onto the version of Chromium being built
    * Patches are from [inox-patchset's GitHub](//github.com/gcarq/inox-patchset)
    * [Inox patchset's license](//github.com/gcarq/inox-patchset/blob/master/LICENSE)
* `bromite/` - Patches from [Bromite](//github.com/bromite/bromite)
* `iridium-browser/` - Contains a modified subset of patches from Iridium Browser.
    * Some patches such as those that change branding or URLs to point to Iridium's own servers are omitted
    * Patches are not modified unless they conflict with Debian's or Inox's patches
    * Patches are from the `patchview` branch of Iridium's Git repository. [Git webview of the patchview branch](//git.iridiumbrowser.de/cgit.cgi/iridium-browser/?h=patchview)
* `opensuse/` - Patches from openSUSE's Chromium
* `ubuntu/` -  Patches from Ubuntu's Chromium
* `ungoogled-chromium/` - Patches by ungoogled-chromium developers

## Packaging

Packaging is the process of downloading, building, and producing a distributable package of ungoogled-chromium.

Packaging files use the code from this repository to build ungoogled-chromium. Each platform and configuration has an associated packaging repository under the [ungoogled-software](//github.com/ungoogled-software) organization. For more information about each packaging repository, see the [building documentation](building.md).

Packaging generally consists of the major steps:

1. Download and unpack the source tree
2. Prune binaries
3. Apply patches
4. Substitute domains
5. Build GN via `tools/gn/bootstrap/bootstrap.py`
6. Run `gn gen` with the GN flags
7. Build Chromium via `ninja`
8. Create package(s) of build output (usually in `out/Default`)


================================================
FILE: docs/developing.md
================================================
# Development notes and procedures

This document contains an assortment of information for those who want to develop ungoogled-chromium.

Information targeted towards developers *and* other users live in [the Wiki](//ungoogled-software.github.io/ungoogled-chromium-wiki/).

Contents:

* [Branches](#branches)
* [Adding command-line flags and chrome://flags options](#adding-command-line-flags-and-chromeflags-options)
* [Workflow of updating to a new Chromium version](#workflow-of-updating-to-a-new-chromium-version)

## Branches

Development is focused on `master`; changes there should not break anything, except when platforms break during a Chromium version rebase.

Larger feature changes or hotfixes must be done in a separate branch. Once the branch is ready, a Pull Request can be made against `master` (for contributors with write access, merging directly via a git client is fine). After the branch is merged, it should be removed.

## Adding command-line flags and `chrome://flags` options

See `docs/how_to_add_your_feature_flag.md` in the Chromium source tree for the steps needed. Note that updating `tools/metrics/histograms/enums.xml` is not required.

For new flags, first add a constant to `third_party/ungoogled/ungoogled_switches.cc` (by modifying patch `resources/patches/ungoogled-chromium/add-third-party-ungoogled.patch`). Then, use this constant in the steps outlined above.

## Workflow of updating to a new Chromium version

These instructions were tested on Debian 10 (buster), but should work on any other Linux or macOS system with the proper dependencies.

To gain a deeper understanding of this updating process, have a read through [docs/design.md](design.md).

### Dependencies

* [`quilt`](http://savannah.nongnu.org/projects/quilt)
    * This is available in most (if not all) Linux distributions, and also Homebrew on macOS.
    * This utility facilitates most of the updating process, so it is important to learn how to use this. The manpage for quilt (as of early 2017) lacks an example of a workflow. There are multiple guides online, but [this guide from Debian](https://wiki.debian.org/UsingQuilt) and [the referenced guide on that page](https://raphaelhertzog.com/2012/08/08/how-to-use-quilt-to-manage-patches-in-debian-packages/) are the ones referenced in developing the current workflow.
* Python 3.6 or newer

### Downloading the source code

```sh
mkdir -p build/download_cache
./utils/downloads.py retrieve -i downloads.ini -c build/download_cache
./utils/downloads.py unpack -i downloads.ini -c build/download_cache build/src
```

### Updating lists

The utility `devutils/update_lists.py` automates this process. By default, it will update the files in the local repo. Pass in `-h` or `--help` for available options.

```sh
./devutils/update_lists.py -t build/src
```

The resulting source tree in `build/src` *will not* have binaries pruned or domains substituted.

### Updating patches

**IMPORTANT**: Make sure domain substitution has not been applied before updating patches.

1. Run `source devutils/set_quilt_vars.sh`
    * This will set up quilt to modify patches directly in `patches/`
2. Go into the source tree: `cd build/src`
3. Use `quilt` to refresh all patches: `while quilt push; do quilt refresh; done`
	* If an error occurs, go to the next step. Otherwise, skip to Step 5.
4. Use `quilt` to fix the broken patch:
    1. Run `quilt push -f`
    2. Edit the broken files, adding (`quilt edit ...` or `quilt add ...`) or removing (`quilt remove ...`) files as necessary
        * When removing large chunks of code, delete each line instead of using language features (such as block comments) to hide or remove the code. This makes the patches less susceptible to breakage from quilt's refresh command: `quilt refresh` updates line numbers based on the patch context, so new but desirable code in the middle of a commented-out block could otherwise be excluded. It also helps with readability when someone wants to see the changes made based on the patch alone.
    3. Refresh the patch: `quilt refresh`
    4. Go back to Step 3.
5. Run `quilt pop -a`
6. Go back to ungoogled-chromium repo: `cd ../..`
7. Run `devutils/validate_config.py`. If any warnings are printed, address them; otherwise, continue to Step 8.
8. Run `devutils/validate_patches.py -l build/src`. If errors occur, go back to Step 3.

This should leave unstaged changes in the git repository to be reviewed, added, and committed.

### Steps for fixing patches after a failed build attempt

If domain substitution is not applied, then the steps from the previous section will work for revising patches.

If domain substitution is applied, then the steps for the initial update will not apply since that would create patches which depend on domain substitution. Here is a method of dealing with this:

1. Revert domain substitution: `./utils/domain_substitution.py revert -c CACHE_PATH_HERE build/src`
2. Follow the patch updating section above
3. Reapply domain substitution: `./utils/domain_substitution.py apply -r domain_regex.list -f domain_substitution.list -c CACHE_PATH_HERE build/src`
4. Reattempt build. Repeat steps as necessary.

### Next steps

* Submit a Pull Request of these changes to the ungoogled-chromium repo.
* Once the PR is merged, update each platform repository that you maintain under the `ungoogled-software` organization.


================================================
FILE: docs/flags.md
================================================
# List of flags and switches

This is an exhaustive list of command-line switches and `chrome://flags` entries introduced by ungoogled-chromium.

**NOTE**: If you pass a command-line argument that is also in `chrome://flags`, the flag's state will not be indicated in `chrome://flags`. There is no universal way to ensure command-line flags are taking effect, but you can verify that they are being seen by checking `chrome://version`.

If a flag requires a value, you must specify it with an `=` sign; e.g. flag `--foo` with value `bar` should be written as `--foo=bar`.

* `--disable-beforeunload` (Not in `chrome://flags`) - Disables JavaScript dialog boxes triggered by `beforeunload`
* `--disable-encryption` (Windows only, not in `chrome://flags`) - Disables encryption of cookies, passwords, and settings, which normally uses a generated machine-specific encryption key. This is used to enable portable user data directories.
* `--disable-machine-id` (Windows only, not in `chrome://flags`) - Disables use of a generated machine-specific ID to lock the user data directory to that machine. This is used to enable portable user data directories.
* `--disable-search-engine-collection` - Disable automatic search engine scraping from webpages.
* `--enable-stacked-tab-strip` and `--enable-tab-adjust-layout` - These flags adjust the tab strip behavior. `--enable-stacked-tab-strip` is also configurable in `chrome://flags`. Please note that they are not well tested, so proceed with caution.
* `--extension-mime-request-handling` - Change how extension MIME types (CRX and user scripts) are handled. Acceptable values are `download-as-regular-file` or `always-prompt-for-install`. Leave unset to use normal behavior. It is also configurable under `chrome://flags`
* `--fingerprinting-canvas-image-data-noise` (Added flag to Bromite feature) - Implements fingerprinting deception for Canvas image data retrieved via JS APIs. In the data, at most 10 pixels are slightly modified.
* `--fingerprinting-canvas-measuretext-noise` (Added flag to Bromite feature) - Scale the output values of Canvas::measureText() with a randomly selected factor in the range -0.0003% to 0.0003%, which are recomputed on every document initialization.
* `--fingerprinting-client-rects-noise` (Added flag to Bromite feature) - Implements fingerprinting deception of JS APIs `getClientRects()` and `getBoundingClientRect()` by scaling their output values with a random factor in the range -0.0003% to 0.0003%, which are recomputed for every document instantiation.
* `--hide-crashed-bubble` (Not in `chrome://flags`) - Hides the bubble box with the message "Restore Pages? Chromium didn't shut down correctly." that shows on startup after the browser did not exit cleanly.
* `--max-connections-per-host` (from Bromite) - Configure the maximum allowed connections per host. Valid values are `6` and `15`.
* `--pdf-plugin-name` - Sets the internal PDF viewer plugin name. Useful for sites that probe JavaScript API `navigator.plugins`. Supports values `chrome` for Chrome, `edge` for Microsoft Edge. Default value when omitted is Chromium.
* `--scroll-tabs` - Determines whether scrolling will switch to a neighboring tab when the cursor hovers over the tabs or the empty space beside them. The flag requires one of the values: `always`, `never`, `incognito-and-guest`. When omitted, the default is to use platform-specific behavior, which is currently enabled only on desktop Linux.
* `--set-ipv6-probe-false` - (Not in `chrome://flags`) Forces the result of the browser's IPv6 probing (i.e. IPv6 connectivity test) to be unsuccessful. This causes IPv4 addresses to be prioritized over IPv6 addresses. Without this flag, the probing result is set to be successful, which causes IPv6 to be used over IPv4 when possible.
* `--show-avatar-button` - Sets visibility of the avatar button. The flag requires one of the values: `always`, `incognito-and-guest` (only show Incognito or Guest modes), or `never`.


================================================
FILE: docs/platforms.md
================================================
# Supported Platforms

This page lists platforms officially supported by ungoogled-chromium, and their associated repositories.

* Android: [ungoogled-chromium-android](//github.com/ungoogled-software/ungoogled-chromium-android)
* Arch Linux: [ungoogled-chromium-archlinux](//github.com/ungoogled-software/ungoogled-chromium-archlinux)
* Debian, Ubuntu, and derivatives: [ungoogled-chromium-debian](//github.com/ungoogled-software/ungoogled-chromium-debian)
* Portable Linux (for any Linux distro): [ungoogled-chromium-portablelinux](//github.com/ungoogled-software/ungoogled-chromium-portablelinux)
* Windows: [ungoogled-chromium-windows](//github.com/ungoogled-software/ungoogled-chromium-windows)
* macOS: [ungoogled-chromium-macos](//github.com/ungoogled-software/ungoogled-chromium-macos)


================================================
FILE: docs/repo_management.md
================================================
# Platform Repository Standards and Guidelines

*This document is new, and its structure and content may change. If you have suggestions, please create an issue!*

ungoogled-chromium is composed of anonymous developers who volunteer their efforts. Some of these developers may choose to provide long-term support for [an officially-supported platform](platforms.md), or bring support to a new platform. For such developers, this document consists of standards and management guidelines for platform repos.

We will refer to this git repository as "the main repo", and refer to repositories that add platform-specific code to build ungoogled-chromium as "platform repos". An "officially-supported platform" is a platform with a platform repo in [the ungoogled-software organization](//github.com/ungoogled-software) and noted in [docs/platforms.md](platforms.md).

## Standards

An officially-supported platform repo:

* Must not modify or remove existing patches, GN flags, domain substitution, or binary pruning in the main repo. Instead, you can add new patches or add more files/rules to domain substitution or binary pruning. (If you think a change is needed in the main repo, please make an issue!)
* Must have a tagging/versioning scheme that includes the ungoogled-chromium version.
* Must not require an Internet connection during compilation (before compilation is OK).
* Should allow the user to download all build requirements before building.
* Must not require external services to build, aside from repos in the ungoogled-software organization and repos provided by or used by Chromium.
* Should have a reproducible build for all versions (currently, there is no formal process to enforce/verify reproducibility of binaries)

Each deviation from these standards must be clearly noted in the platform repo's documentation (such as the repo's README), and have an associated issue in the platform repo.

## Teams in the ungoogled-software organization

Each officially-supported platform has one or more teams in the ungoogled-software organization. These teams provide additional means for collaborating with other developers, such as issue triaging and private discussions (see section "How to communicate" below).

If you are a regular contributor and would like to provide long-term support for a platform, you can request to be included in the ungoogled-software organization team for your platform. Since the number of developers is low, there is no formal process to do this; just ask in an issue.

## How to communicate

In the interest of transparency, it is recommended to discuss work in public spaces like issues or PRs. If a discussion should not involve outsiders, you can lock the issue or PR to collaborators only.

You must use team discussions if you are discussing or sharing information that can affect the security of the repository. Otherwise, you may use team discussions at your discretion.

## Issues

Each platform repo should have a team in ungoogled-software with the Triage permission level. All members should feel free to manage issues.

TODO: More details?

## Pull Requests

TODO

## Repository Settings and Shared Resources

Shared resources include:

* CI services like CirrusCI, GitHub Actions, etc.
* Build services like OpenSUSE Build Service (OBS)

These need to be handled with care, as they can cause a wide variety of issues, from security and privacy leaks to data loss.

There are several ways to handle shared resources:

* Assign one person to manage a certain set of settings (i.e. grant them "ownership" of those settings). If you want to change a setting, you should request a change in a team discussion.
* TODO: More ways to manage settings?


================================================
FILE: domain_regex.list
================================================
fonts(\\*?)\.googleapis(\\*?)\.com#f0ntz\g<1>.9oo91e8p1\g<2>.qjz9zk
google([A-Za-z\-]*?\\*?)\.com(?!mon)#9oo91e\g<1>.qjz9zk
gstatic([A-Za-z\-]*?\\*?)\.com#95tat1c\g<1>.qjz9zk
chrome([A-Za-z\-]*?\\*?)\.com#ch40me\g<1>.qjz9zk
chromium([A-Za-z\-]*?\\*?)\.org#ch40m1um\g<1>.qjz9zk
mozilla([A-Za-z\-]*?\\*?)\.org#m0z111a\g<1>.qjz9zk
facebook([A-Za-z\-]*?\\*?)\.com#f8c3b00k\g<1>.qjz9zk
appspot([A-Za-z\-]*?\\*?)\.com#8pp2p8t\g<1>.qjz9zk
youtube([A-Za-z\-]*?\\*?)\.com#y0u1ub3\g<1>.qjz9zk
ytimg([A-Za-z\-]*?\\*?)\.com#yt1mg\g<1>.qjz9zk
gmail([A-Za-z\-]*?\\*?)\.com#9ma1l\g<1>.qjz9zk
doubleclick([A-Za-z\-]*?\\*?)\.net#60u613cl1c4\g<1>.n3t.qjz9zk
doubleclick([A-Za-z\-]*?\\*?)\.com#60u613cl1c4\g<1>.c0m.qjz9zk
googlezip(\\*?)\.net#9oo91e21p\g<1>.qjz9zk
beacons([1-9]?\\*?)\.gvt([1-9]?\\*?)\.com#b3ac0n2\g<1>.9vt\g<2>.qjz9zk
ggpht(\\*?)\.com#99pht\g<1>.qjz9zk
microsoft(\\*?)\.com#m1cr050ft\g<1>.qjz9zk
1e100(\\*?)\.net#l3lOO\g<1>.qjz9zk
(?<!http://schemas.)android(\\*?)\.com#8n6r01d\g<1>.qjz9zk
goo\.gl#goo.gl.qjz9zk


================================================
FILE: domain_substitution.list
================================================
.gn
BUILD.gn
PRESUBMIT.py
PRESUBMIT_test.py
android_webview/browser/aw_browser_context.cc
android_webview/browser/aw_content_browser_client.h
android_webview/browser/aw_contents_io_thread_client.cc
android_webview/browser/aw_permission_manager_unittest.cc
android_webview/browser/aw_settings.cc
android_webview/browser/network_service/aw_web_resource_request.h
android_webview/browser/permission/media_access_permission_request_unittest.cc
android_webview/browser/permission/permission_request_handler_unittest.cc
android_webview/browser/renderer_host/auto_login_parser_unittest.cc
android_webview/browser/safe_browsing/aw_safe_browsing_whitelist_manager.cc
android_webview/browser/safe_browsing/aw_safe_browsing_whitelist_manager.h
android_webview/browser/safe_browsing/aw_safe_browsing_whitelist_manager_unittest.cc
android_webview/common/url_constants.cc
android_webview/lib/aw_main_delegate.cc
android_webview/nonembedded/java/res_devui/values/strings.xml
android_webview/tools/cts_config/webview_cts_gcs_path.json
android_webview/tools/cts_utils.py
android_webview/tools/record_netlog.py
android_webview/tools/remove_preinstalled_webview.py
android_webview/tools/update_cts.py
ash/app_list/PRESUBMIT.py
ash/app_list/views/app_list_view_unittest.cc
ash/app_list/views/assistant/privacy_info_view.cc
ash/app_list/views/search_result_answer_card_view_unittest.cc
ash/ash_strings.grd
ash/assistant/assistant_interaction_controller.cc
ash/assistant/assistant_setup_controller.cc
ash/assistant/util/deep_link_util.cc
ash/assistant/util/deep_link_util_unittest.cc
ash/fast_ink/laser/laser_pointer_view.cc
ash/login/parent_access_controller_unittest.cc
ash/login/ui/login_user_menu_view.cc
ash/login/ui/login_user_menu_view_unittest.cc
ash/perftests/overview_animations_preftest.cc
ash/public/cpp/android_intent_helper_unittest.cc
ash/public/cpp/app_list/app_list_features.cc
ash/public/cpp/app_list/internal_app_id_constants.h
ash/resources/PRESUBMIT.py
ash/shell/content/client/shell_browser_main_parts.cc
ash/shell/content/client/shell_new_window_delegate.cc
ash/system/unified/user_chooser_detailed_view_controller_unittest.cc
ash/wm/screen_pinning_controller.h
base/BUILD.gn
base/PRESUBMIT.py
base/allocator/BUILD.gn
base/allocator/partition_allocator/address_space_randomization.h
base/android/android_image_reader_abi.h
base/android/jni_generator/PRESUBMIT.py
base/android/jni_generator/jni_generator.py
base/android/library_loader/library_loader_hooks.h
base/atomicops.h
base/base_paths_win.cc
base/command_line.cc
base/debug/debugger_posix.cc
base/debug/proc_maps_linux.cc
base/debug/stack_trace_fuchsia.cc
base/debug/stack_trace_posix.cc
base/debug/stack_trace_win.cc
base/file_version_info.h
base/files/file_enumerator_posix.cc
base/files/file_unittest.cc
base/files/file_util_posix.cc
base/files/file_util_win.cc
base/files/file_win.cc
base/hash/hash.cc
base/i18n/break_iterator_unittest.cc
base/i18n/file_util_icu.cc
base/i18n/rtl_unittest.cc
base/i18n/timezone_unittest.cc
base/ios/device_util.mm
base/lazy_instance_helpers.h
base/logging.cc
base/mac/close_nocancel.cc
base/mac/objc_release_properties_unittest.mm
base/memory/aligned_memory.cc
base/memory/discardable_shared_memory.cc
base/memory/scoped_refptr.h
base/memory/shared_memory_mapping_unittest.cc
base/memory/shared_memory_security_policy.cc
base/message_loop/message_loop_unittest.cc
base/message_loop/message_pump_win.cc
base/metrics/field_trial.h
base/metrics/histogram_functions.h
base/metrics/histogram_macros.h
base/metrics/user_metrics.h
base/native_library_win.cc
base/optional.h
base/process/launch.h
base/process/launch_posix.cc
base/process/memory.cc
base/process/memory.h
base/process/process.h
base/process/process_metrics.h
base/process/process_unittest.cc
base/profiler/metadata_recorder.h
base/rand_util_win.cc
base/security_unittest.cc
base/strings/pattern_unittest.cc
base/strings/string_number_conversions_unittest.cc
base/synchronization/lock.h
base/system/sys_info.h
base/task/sequence_manager/sequence_manager_impl.cc
base/task/task_traits.h
base/task/thread_pool/thread_group_native_win.h
base/test/launcher/test_launcher.cc
base/test/sequenced_task_runner_test_template.h
base/test/test_file_util.h
base/test/test_file_util_win.cc
base/test/test_suite.cc
base/third_party/cityhash/city.h
base/third_party/dynamic_annotations/dynamic_annotations.h
base/third_party/libevent/evdns.c
base/third_party/libevent/evdns.h
base/third_party/libevent/evport.c
base/third_party/libevent/min_heap.h
base/third_party/nspr/prtime.cc
base/third_party/nspr/prtime.h
base/third_party/symbolize/symbolize.cc
base/threading/platform_thread_unittest.cc
base/threading/platform_thread_win.cc
base/time/time.h
base/time/time_win.cc
base/trace_event/cfi_backtrace_android_unittest.cc
base/trace_event/heap_profiler_allocation_context.h
base/trace_event/malloc_dump_provider.cc
base/trace_event/process_memory_dump.h
base/trace_event/trace_category.h
base/values.h
base/values_unittest.cc
base/win/pe_image.h
base/win/registry.cc
base/win/registry.h
base/win/registry_unittest.cc
base/win/scoped_com_initializer.cc
base/win/shortcut.h
base/win/win_util.cc
base/win/win_util.h
base/win/wincrypt_shim.h
base/win/windows_version.h
base/win/wmi.cc
base/win/wmi.h
build/android/PRESUBMIT.py
build/android/apk_operations.py
build/android/dump_apk_resource_strings.py
build/android/gradle/generate_gradle.py
build/android/gyp/assert_static_initializers.py
build/android/gyp/compile_java.py
build/android/gyp/compile_resources.py
build/android/gyp/lint.py
build/android/gyp/main_dex_list.py
build/android/gyp/merge_manifest.py
build/android/gyp/proguard.py
build/android/gyp/util/diff_utils.py
build/android/gyp/util/resources_parser.py
build/android/incremental_install/installer.py
build/android/lint/suppressions.xml
build/android/pylib/constants/__init__.py
build/android/pylib/device_settings.py
build/android/pylib/dex/dex_parser.py
build/android/pylib/instrumentation/instrumentation_parser.py
build/android/pylib/instrumentation/instrumentation_test_instance.py
build/android/pylib/instrumentation/render_test.html.jinja
build/android/pylib/local/device/local_device_gtest_run.py
build/android/pylib/local/device/local_device_monkey_test_run.py
build/android/pylib/local/emulator/avd.py
build/android/pylib/results/flakiness_dashboard/json_results_generator.py
build/android/pylib/results/presentation/test_results_presentation.py
build/android/pylib/utils/google_storage_helper.py
build/android/pylib/utils/maven_downloader.py
build/android/pylib/utils/simpleperf.py
build/android/resource_sizes.py
build/android/test_runner.py
build/android/test_wrapper/logdog_wrapper.py
build/build_config.h
build/chromeos/PRESUBMIT.py
build/chromeos/test_runner.py
build/config/BUILDCONFIG.gn
build/config/android/rules.gni
build/config/chrome_build.gni
build/config/chromeos/args.gni
build/config/chromeos/rules.gni
build/config/compiler/BUILD.gn
build/config/compiler/compiler.gni
build/config/fuchsia/BUILD.gn
build/config/fuchsia/elfinfo.py
build/config/nacl/BUILD.gn
build/config/nacl/rules.gni
build/config/sanitizers/sanitizers.gni
build/config/win/BUILD.gn
build/find_isolated_tests.py
build/linux/install-chromeos-fonts.py
build/linux/sysroot_scripts/install-sysroot.py
build/linux/unbundle/remove_bundled_libraries.py
build/mac/tweak_info_plist.py
build/mac_toolchain.py
build/nocompile.gni
build/package_mac_toolchain.py
build/run_swarming_xcode_install.py
build/sanitizers/lsan_suppressions.cc
build/sanitizers/tsan_suppressions.cc
build/toolchain/cros_toolchain.gni
build/toolchain/nacl/BUILD.gn
build/toolchain/win/midl.py
build/toolchain/win/rc/rc.py
build/util/lib/common/perf_tests_results_helper.py
build/vs_toolchain.py
build/whitespace_file.txt
buildtools/README.txt
buildtools/clang_format/README.txt
buildtools/ensure_gn_version.py
buildtools/third_party/libc++/trunk/CREDITS.TXT
buildtools/third_party/libc++/trunk/src/chrono.cpp
buildtools/third_party/libc++/trunk/utils/google-benchmark/src/cycleclock.h
buildtools/third_party/libc++abi/trunk/CREDITS.TXT
cc/PRESUBMIT.py
cc/animation/animation_delegate.h
cc/input/browser_controls_offset_manager.cc
cc/input/scroll_state.h
cc/input/scrollbar_controller.h
cc/paint/paint_image.h
cc/tiles/gpu_image_decode_cache.h
cc/trees/layer_tree_host_impl.cc
chrome/BUILD.gn
chrome/PRESUBMIT.py
chrome/android/BUILD.gn
chrome/android/examples/partner_browser_customizations_provider/res/values/strings.xml
chrome/android/features/autofill_assistant/java/strings/android_chrome_autofill_assistant_strings.grd
chrome/android/features/cablev2_authenticator/internal/BUILD.gn
chrome/android/java/res/layout/content_suggestions_card_modern_reversed.xml
chrome/android/java/res/values-sw600dp/values.xml
chrome/android/java/res/values/dimens.xml
chrome/android/java/res/values/strings.xml
chrome/android/java/res/values/values.xml
chrome/android/java/src/PRESUBMIT.py
chrome/android/javatests/AndroidManifest.xml
chrome/android/javatests/AndroidManifest_monochrome.xml
chrome/android/webapk/PRESUBMIT.py
chrome/android/webapk/shell_apk/manifest/maps_go_manifest_config.json
chrome/android/webapk/shell_apk/res/values/dimens.xml
chrome/app/PRESUBMIT.py
chrome/app/chrome_command_ids.h
chrome/app/chromium_strings.grd
chrome/app/generated_resources.grd
chrome/app/google_chrome_strings.grd
chrome/app/resources/locale_settings.grd
chrome/app/theme/PRESUBMIT.py
chrome/browser/about_flags.cc
chrome/browser/android/autofill_assistant/client_android.cc
chrome/browser/android/color_helpers_unittest.cc
chrome/browser/android/contextualsearch/contextual_search_context.h
chrome/browser/android/contextualsearch/contextual_search_delegate_unittest.cc
chrome/browser/android/customtabs/custom_tabs_browsertest.cc
chrome/browser/android/customtabs/detached_resource_request_unittest.cc
chrome/browser/android/digital_asset_links/digital_asset_links_handler.cc
chrome/browser/android/digital_asset_links/digital_asset_links_handler.h
chrome/browser/android/explore_sites/blacklist_site_task_unittest.cc
chrome/browser/android/explore_sites/clear_activities_task_unittest.cc
chrome/browser/android/explore_sites/explore_sites_fetcher_unittest.cc
chrome/browser/android/explore_sites/history_statistics_reporter_unittest.cc
chrome/browser/android/explore_sites/import_catalog_task_unittest.cc
chrome/browser/android/explore_sites/ntp_json_fetcher_unittest.cc
chrome/browser/android/explore_sites/record_site_click_task_unittest.cc
chrome/browser/android/explore_sites/url_util.cc
chrome/browser/android/explore_sites/url_util_experimental.cc
chrome/browser/android/history_report/delta_file_commons_unittest.cc
chrome/browser/android/search_permissions/search_permissions_service.h
chrome/browser/android/search_permissions/search_permissions_service_unittest.cc
chrome/browser/android/shortcut_info.cc
chrome/browser/android/signin/signin_manager_android_unittest.cc
chrome/browser/android/tab_android.cc
chrome/browser/android/tab_state.cc
chrome/browser/android/vr/PRESUBMIT.py
chrome/browser/android/vr/arcore_device/arcore_impl.cc
chrome/browser/android/vr/arcore_device/arcore_install_helper.cc
chrome/browser/android/webapk/webapk_icon_hasher_unittest.cc
chrome/browser/apps/app_service/app_icon_factory.cc
chrome/browser/apps/app_service/app_service_metrics.cc
chrome/browser/apps/guest_view/web_view_browsertest.cc
chrome/browser/apps/intent_helper/apps_navigation_throttle_unittest.cc
chrome/browser/apps/intent_helper/intent_picker_auto_display_service_unittest.cc
chrome/browser/apps/platform_apps/app_browsertest.cc
chrome/browser/apps/platform_apps/app_window_browsertest.cc
chrome/browser/apps/platform_apps/install_chrome_app.cc
chrome/browser/autocomplete/autocomplete_browsertest.cc
chrome/browser/autocomplete/chrome_autocomplete_provider_client_unittest.cc
chrome/browser/autocomplete/chrome_autocomplete_scheme_classifier_unittest.cc
chrome/browser/autocomplete/search_provider_unittest.cc
chrome/browser/autofill/autofill_browsertest.cc
chrome/browser/autofill/autofill_captured_sites_interactive_uitest.cc
chrome/browser/autofill/autofill_gstatic_reader.cc
chrome/browser/autofill/autofill_interactive_uitest.cc
chrome/browser/autofill/autofill_server_browsertest.cc
chrome/browser/autofill/automated_tests/cache_replayer.cc
chrome/browser/autofill/automated_tests/cache_replayer.h
chrome/browser/autofill/automated_tests/cache_replayer_unittest.cc
chrome/browser/autofill/captured_sites_test_utils.cc
chrome/browser/banners/app_banner_settings_helper_unittest.cc
chrome/browser/bluetooth/web_bluetooth_browsertest.cc
chrome/browser/bookmarks/managed_bookmark_service_unittest.cc
chrome/browser/browser_about_handler_unittest.cc
chrome/browser/browser_commands_unittest.cc
chrome/browser/browser_switcher/browser_switcher_browsertest.cc
chrome/browser/browser_switcher/browser_switcher_service.cc
chrome/browser/browser_switcher/browser_switcher_service_browsertest.cc
chrome/browser/browser_switcher/browser_switcher_sitelist_unittest.cc
chrome/browser/browser_switcher/ieem_sitelist_parser.cc
chrome/browser/browser_switcher/ieem_sitelist_parser_browsertest.cc
chrome/browser/browsing_data/browsing_data_cookie_helper_unittest.cc
chrome/browser/browsing_data/browsing_data_local_storage_helper_browsertest.cc
chrome/browser/browsing_data/browsing_data_remover_browsertest.cc
chrome/browser/browsing_data/chrome_browsing_data_remover_delegate_unittest.cc
chrome/browser/browsing_data/cookies_tree_model.cc
chrome/browser/browsing_data/counters/bookmark_counter_unittest.cc
chrome/browser/browsing_data/counters/browsing_data_counter_utils_browsertest.cc
chrome/browser/browsing_data/counters/history_counter_browsertest.cc
chrome/browser/browsing_data/counters/passwords_counter_browsertest.cc
chrome/browser/browsing_data/counters/site_data_counting_helper_unittest.cc
chrome/browser/browsing_data/counters/site_settings_counter_unittest.cc
chrome/browser/chrome_browser_application_mac.mm
chrome/browser/chrome_content_browser_client.cc
chrome/browser/chrome_content_browser_client_browsertest.cc
chrome/browser/chrome_content_browser_client_unittest.cc
chrome/browser/chrome_navigation_browsertest.cc
chrome/browser/chromeos/accessibility/accessibility_manager_browsertest.cc
chrome/browser/chromeos/accessibility/select_to_speak_live_site_browsertest.cc
chrome/browser/chromeos/android_sms/android_sms_urls.cc
chrome/browser/chromeos/android_sms/android_sms_urls.h
chrome/browser/chromeos/app_mode/fake_cws.cc
chrome/browser/chromeos/apps/apk_web_app_installer_browsertest.cc
chrome/browser/chromeos/apps/apk_web_app_installer_unittest.cc
chrome/browser/chromeos/arc/accessibility/arc_accessibility_util.cc
chrome/browser/chromeos/arc/arc_util_unittest.cc
chrome/browser/chromeos/arc/auth/arc_auth_service_browsertest.cc
chrome/browser/chromeos/arc/auth/arc_background_auth_code_fetcher.cc
chrome/browser/chromeos/arc/auth/arc_robot_auth_code_fetcher.cc
chrome/browser/chromeos/arc/bluetooth/arc_bluetooth_bridge.cc
chrome/browser/chromeos/arc/bluetooth/arc_bluetooth_bridge.h
chrome/browser/chromeos/arc/file_system_watcher/file_system_scanner.h
chrome/browser/chromeos/arc/intent_helper/arc_external_protocol_dialog_unittest.cc
chrome/browser/chromeos/arc/policy/arc_policy_bridge_unittest.cc
chrome/browser/chromeos/arc/policy/arc_policy_util.h
chrome/browser/chromeos/arc/session/arc_play_store_enabled_preference_handler_unittest.cc
chrome/browser/chromeos/arc/session/arc_session_manager_browsertest.cc
chrome/browser/chromeos/arc/session/arc_session_manager_unittest.cc
chrome/browser/chromeos/arc/tracing/arc_app_performance_tracing.cc
chrome/browser/chromeos/assistant/assistant_util.cc
chrome/browser/chromeos/assistant/assistant_util_unittest.cc
chrome/browser/chromeos/attestation/attestation_ca_client.cc
chrome/browser/chromeos/attestation/attestation_ca_client_unittest.cc
chrome/browser/chromeos/attestation/platform_verification_flow_unittest.cc
chrome/browser/chromeos/attestation/tpm_challenge_key_unittest.cc
chrome/browser/chromeos/backdrop_wallpaper_handlers/backdrop_wallpaper_handlers.cc
chrome/browser/chromeos/bluetooth/debug_logs_manager_unittest.cc
chrome/browser/chromeos/cert_provisioning/cert_provisioning_worker_unittest.cc
chrome/browser/chromeos/child_accounts/parent_access_code/parent_access_service_browsertest.cc
chrome/browser/chromeos/child_accounts/time_limits/web_time_limit_error_page/resources/web_time_limit_error_page.html
chrome/browser/chromeos/chrome_content_browser_client_chromeos_part_unittest.cc
chrome/browser/chromeos/crostini/crostini_util.h
chrome/browser/chromeos/customization/customization_document.cc
chrome/browser/chromeos/dbus/proxy_resolution_service_provider.h
chrome/browser/chromeos/dbus/proxy_resolution_service_provider_browsertest.cc
chrome/browser/chromeos/dbus/proxy_resolution_service_provider_unittest.cc
chrome/browser/chromeos/drive/drivefs_test_support.cc
chrome/browser/chromeos/extensions/default_web_app_ids.h
chrome/browser/chromeos/extensions/device_local_account_management_policy_provider.cc
chrome/browser/chromeos/extensions/file_manager/private_api_drive.cc
chrome/browser/chromeos/extensions/file_manager/private_api_misc.cc
chrome/browser/chromeos/extensions/permissions_updater_delegate_chromeos_unittest.cc
chrome/browser/chromeos/extensions/printing/printing_api_utils.h
chrome/browser/chromeos/extensions/quick_unlock_private/quick_unlock_private_api_unittest.cc
chrome/browser/chromeos/extensions/users_private/users_private_apitest.cc
chrome/browser/chromeos/file_manager/file_browser_handlers.h
chrome/browser/chromeos/file_manager/file_manager_string_util.cc
chrome/browser/chromeos/file_manager/file_tasks.h
chrome/browser/chromeos/file_manager/path_util_unittest.cc
chrome/browser/chromeos/file_system_provider/fileapi/provider_async_file_util.h
chrome/browser/chromeos/first_run/drive_first_run_controller.cc
chrome/browser/chromeos/first_run/goodies_displayer.cc
chrome/browser/chromeos/hats/hats_dialog.cc
chrome/browser/chromeos/kerberos/kerberos_credentials_manager_test.cc
chrome/browser/chromeos/logging_browsertest.cc
chrome/browser/chromeos/login/arc_terms_of_service_browsertest.cc
chrome/browser/chromeos/login/easy_unlock/easy_unlock_key_names.cc
chrome/browser/chromeos/login/easy_unlock/easy_unlock_screenlock_state_handler_unittest.cc
chrome/browser/chromeos/login/encryption_migration_browsertest.cc
chrome/browser/chromeos/login/error_screen_browsertest.cc
chrome/browser/chromeos/login/existing_user_controller_browsertest.cc
chrome/browser/chromeos/login/help_app_launcher.h
chrome/browser/chromeos/login/kiosk_browsertest.cc
chrome/browser/chromeos/login/login_browsertest.cc
chrome/browser/chromeos/login/login_ui_hide_supervised_users_browsertest.cc
chrome/browser/chromeos/login/login_ui_keyboard_browsertest.cc
chrome/browser/chromeos/login/login_ui_shelf_visibility_browsertest.cc
chrome/browser/chromeos/login/marketing_backend_connector.cc
chrome/browser/chromeos/login/password_change_browsertest.cc
chrome/browser/chromeos/login/profile_auth_data_unittest.cc
chrome/browser/chromeos/login/proxy_auth_dialog_browsertest.cc
chrome/browser/chromeos/login/quick_unlock/pin_migration_browsertest.cc
chrome/browser/chromeos/login/reset_browsertest.cc
chrome/browser/chromeos/login/saml/saml_browsertest.cc
chrome/browser/chromeos/login/screens/assistant_optin_flow_screen_browsertest.cc
chrome/browser/chromeos/login/screens/recommend_apps/recommend_apps_fetcher_impl.cc
chrome/browser/chromeos/login/screens/recommend_apps/recommend_apps_fetcher_impl_unittest.cc
chrome/browser/chromeos/login/screens/sync_consent_browsertest.cc
chrome/browser/chromeos/login/session/user_session_manager.cc
chrome/browser/chromeos/login/session_login_browsertest.cc
chrome/browser/chromeos/login/signin/device_id_browsertest.cc
chrome/browser/chromeos/login/signin/oauth2_browsertest.cc
chrome/browser/chromeos/login/test/fake_gaia_mixin.cc
chrome/browser/chromeos/login/test/fake_gaia_mixin.h
chrome/browser/chromeos/login/test/login_manager_mixin.cc
chrome/browser/chromeos/login/ui/captive_portal_view.cc
chrome/browser/chromeos/login/ui/user_adding_screen_browsertest.cc
chrome/browser/chromeos/login/users/avatar/user_image_manager_browsertest.cc
chrome/browser/chromeos/login/users/multi_profile_user_controller_unittest.cc
chrome/browser/chromeos/login/users/remove_supervised_users_browsertest.cc
chrome/browser/chromeos/login/users/user_manager_hide_supervised_users_browsertest.cc
chrome/browser/chromeos/login/web_kiosk_controller.cc
chrome/browser/chromeos/net/network_portal_detector_impl_browsertest.cc
chrome/browser/chromeos/plugin_vm/plugin_vm_installer_unittest.cc
chrome/browser/chromeos/plugin_vm/plugin_vm_util.cc
chrome/browser/chromeos/plugin_vm/plugin_vm_util_unittest.cc
chrome/browser/chromeos/policy/active_directory_policy_manager.cc
chrome/browser/chromeos/policy/android_management_client_unittest.cc
chrome/browser/chromeos/policy/device_cloud_policy_browsertest.cc
chrome/browser/chromeos/policy/device_local_account_browsertest.cc
chrome/browser/chromeos/policy/device_local_account_policy_service_unittest.cc
chrome/browser/chromeos/policy/heartbeat_scheduler.cc
chrome/browser/chromeos/policy/powerwash_requirements_checker.cc
chrome/browser/chromeos/policy/remote_commands/crd_host_delegate.cc
chrome/browser/chromeos/policy/status_collector/child_status_collector_browsertest.cc
chrome/browser/chromeos/policy/status_collector/device_status_collector_browsertest.cc
chrome/browser/chromeos/policy/status_collector/status_collector.h
chrome/browser/chromeos/policy/status_uploader_unittest.cc
chrome/browser/chromeos/policy/upload_job_unittest.cc
chrome/browser/chromeos/policy/user_cloud_policy_manager_chromeos_browsertest.cc
chrome/browser/chromeos/policy/user_cloud_policy_manager_chromeos_unittest.cc
chrome/browser/chromeos/policy/user_cloud_policy_store_chromeos_unittest.cc
chrome/browser/chromeos/policy/user_cloud_policy_token_forwarder_unittest.cc
chrome/browser/chromeos/power/auto_screen_brightness/adapter_unittest.cc
chrome/browser/chromeos/power/auto_screen_brightness/modeller_impl_unittest.cc
chrome/browser/chromeos/power/ml/smart_dim/ml_agent_unittest.cc
chrome/browser/chromeos/power/ml/smart_dim/model_unittest.cc
chrome/browser/chromeos/preferences_chromeos_browsertest.cc
chrome/browser/chromeos/printing/specifics_translation_unittest.cc
chrome/browser/chromeos/printing/synced_printers_manager_unittest.cc
chrome/browser/chromeos/proxy_config_service_impl_unittest.cc
chrome/browser/chromeos/release_notes/release_notes_notification_unittest.cc
chrome/browser/chromeos/release_notes/release_notes_storage_unittest.cc
chrome/browser/chromeos/scheduler_configuration_manager.h
chrome/browser/chromeos/settings/cros_settings_unittest.cc
chrome/browser/chromeos/smb_client/smb_service_helper.h
chrome/browser/chromeos/sync/turn_sync_on_helper_unittest.cc
chrome/browser/chromeos/tpm_firmware_update.h
chrome/browser/chromeos/u2f_notification.cc
chrome/browser/complex_tasks/endpoint_fetcher/endpoint_fetcher_unittest.cc
chrome/browser/complex_tasks/task_tab_helper_unittest.cc
chrome/browser/component_updater/recovery_component_installer.cc
chrome/browser/content_settings/content_settings_default_provider_unittest.cc
chrome/browser/content_settings/content_settings_origin_identifier_value_map_unittest.cc
chrome/browser/content_settings/content_settings_policy_provider_unittest.cc
chrome/browser/content_settings/content_settings_pref_provider_unittest.cc
chrome/browser/content_settings/host_content_settings_map_unittest.cc
chrome/browser/content_settings/sound_content_setting_observer_unittest.cc
chrome/browser/content_settings/tab_specific_content_settings_unittest.cc
chrome/browser/custom_handlers/protocol_handler_registry.cc
chrome/browser/custom_handlers/protocol_handler_registry_browsertest.cc
chrome/browser/custom_handlers/protocol_handler_registry_unittest.cc
chrome/browser/data_reduction_proxy/data_reduction_proxy_chrome_settings.cc
chrome/browser/data_reduction_proxy/data_reduction_proxy_chrome_settings_unittest.cc
chrome/browser/devtools/device/adb/adb_client_socket_browsertest.cc
chrome/browser/devtools/device/adb/mock_adb_server.cc
chrome/browser/devtools/devtools_sanity_browsertest.cc
chrome/browser/devtools/devtools_ui_bindings_unittest.cc
chrome/browser/devtools/url_constants.cc
chrome/browser/download/chrome_download_manager_delegate_unittest.cc
chrome/browser/download/download_browsertest.cc
chrome/browser/download/mixed_content_download_blocking.cc
chrome/browser/download/save_page_browsertest.cc
chrome/browser/engagement/important_sites_usage_counter_unittest.cc
chrome/browser/engagement/important_sites_util_unittest.cc
chrome/browser/engagement/site_engagement_helper.cc
chrome/browser/engagement/site_engagement_helper_unittest.cc
chrome/browser/engagement/site_engagement_score_unittest.cc
chrome/browser/engagement/site_engagement_service_unittest.cc
chrome/browser/enterprise/connectors/connectors_manager_unittest.cc
chrome/browser/enterprise/reporting/notification/extension_request_notification.cc
chrome/browser/enterprise/reporting/notification/extension_request_notification_unittest.cc
chrome/browser/enterprise/reporting/notification/extension_request_observer_unittest.cc
chrome/browser/extensions/active_tab_apitest.cc
chrome/browser/extensions/active_tab_unittest.cc
chrome/browser/extensions/activity_log/activity_log_policy_unittest.cc
chrome/browser/extensions/activity_log/activity_log_unittest.cc
chrome/browser/extensions/activity_log/counting_policy_unittest.cc
chrome/browser/extensions/activity_log/fullstream_ui_policy_unittest.cc
chrome/browser/extensions/api/README.txt
chrome/browser/extensions/api/activity_log_private/activity_log_private_api_unittest.cc
chrome/browser/extensions/api/autofill_assistant_private/autofill_assistant_private_api.cc
chrome/browser/extensions/api/bookmark_manager_private/bookmark_manager_private_api_unittest.cc
chrome/browser/extensions/api/bookmarks/bookmark_api_helpers_unittest.cc
chrome/browser/extensions/api/bookmarks/bookmark_apitest.cc
chrome/browser/extensions/api/braille_display_private/braille_display_private_apitest.cc
chrome/browser/extensions/api/cloud_print_private/cloud_print_private_apitest.cc
chrome/browser/extensions/api/content_settings/content_settings_apitest.cc
chrome/browser/extensions/api/content_settings/content_settings_store_unittest.cc
chrome/browser/extensions/api/content_settings/content_settings_unittest.cc
chrome/browser/extensions/api/cryptotoken_private/cryptotoken_private_api.cc
chrome/browser/extensions/api/cryptotoken_private/cryptotoken_private_api_unittest.cc
chrome/browser/extensions/api/declarative_content/chrome_content_rules_registry.h
chrome/browser/extensions/api/declarative_content/content_action.h
chrome/browser/extensions/api/declarative_content/content_condition.h
chrome/browser/extensions/api/declarative_content/content_predicate.h
chrome/browser/extensions/api/declarative_content/content_predicate_evaluator.h
chrome/browser/extensions/api/declarative_content/declarative_content_page_url_condition_tracker_unittest.cc
chrome/browser/extensions/api/declarative_net_request/declarative_net_request_browsertest.cc
chrome/browser/extensions/api/declarative_net_request/rule_indexing_unittest.cc
chrome/browser/extensions/api/declarative_net_request/ruleset_manager_unittest.cc
chrome/browser/extensions/api/declarative_webrequest/webrequest_action_unittest.cc
chrome/browser/extensions/api/declarative_webrequest/webrequest_rules_registry_unittest.cc
chrome/browser/extensions/api/developer_private/developer_private_api_unittest.cc
chrome/browser/extensions/api/developer_private/extension_info_generator.cc
chrome/browser/extensions/api/developer_private/extension_info_generator_unittest.cc
chrome/browser/extensions/api/downloads/downloads_api.h
chrome/browser/extensions/api/downloads/downloads_api_browsertest.cc
chrome/browser/extensions/api/enterprise_platform_keys/enterprise_platform_keys_api_unittest.cc
chrome/browser/extensions/api/enterprise_platform_keys/enterprise_platform_keys_apitest_nss.cc
chrome/browser/extensions/api/enterprise_platform_keys_private/enterprise_platform_keys_private_api_unittest.cc
chrome/browser/extensions/api/enterprise_reporting_private/device_info_fetcher_win.cc
chrome/browser/extensions/api/extension_action/browser_action_apitest.cc
chrome/browser/extensions/api/feedback_private/feedback_browsertest.cc
chrome/browser/extensions/api/identity/gaia_web_auth_flow.h
chrome/browser/extensions/api/identity/gaia_web_auth_flow_unittest.cc
chrome/browser/extensions/api/identity/identity_apitest.cc
chrome/browser/extensions/api/identity/identity_launch_web_auth_flow_function.cc
chrome/browser/extensions/api/image_writer_private/removable_storage_provider.cc
chrome/browser/extensions/api/image_writer_private/removable_storage_provider_linux.cc
chrome/browser/extensions/api/management/chrome_management_api_delegate.cc
chrome/browser/extensions/api/management/management_api_unittest.cc
chrome/browser/extensions/api/passwords_private/password_check_delegate_unittest.cc
chrome/browser/extensions/api/passwords_private/passwords_private_delegate_impl_unittest.cc
chrome/browser/extensions/api/passwords_private/passwords_private_utils_unittest.cc
chrome/browser/extensions/api/permissions/permissions_api_helpers.cc
chrome/browser/extensions/api/permissions/permissions_api_unittest.cc
chrome/browser/extensions/api/preference/preference_api_prefs_unittest.cc
chrome/browser/extensions/api/proxy/proxy_api_helpers_unittest.cc
chrome/browser/extensions/api/runtime/runtime_apitest.cc
chrome/browser/extensions/api/settings_overrides/settings_overrides_browsertest.cc
chrome/browser/extensions/api/tab_capture/tab_capture_apitest.cc
chrome/browser/extensions/api/tab_capture/tab_capture_performance_test_base.cc
chrome/browser/extensions/api/tabs/tabs_api_unittest.cc
chrome/browser/extensions/api/tabs/tabs_test.cc
chrome/browser/extensions/api/web_navigation/frame_navigation_state_unittest.cc
chrome/browser/extensions/api/web_request/web_request_api_unittest.cc
chrome/browser/extensions/api/web_request/web_request_apitest.cc
chrome/browser/extensions/api/web_request/web_request_permissions_unittest.cc
chrome/browser/extensions/api/webrtc_audio_private/webrtc_audio_private_browsertest.cc
chrome/browser/extensions/api/webstore_private/extension_install_status_unittest.cc
chrome/browser/extensions/api/webstore_private/webstore_private_apitest.cc
chrome/browser/extensions/background_xhr_browsertest.cc
chrome/browser/extensions/chrome_extension_browser_constants.cc
chrome/browser/extensions/chrome_extension_function_details.cc
chrome/browser/extensions/chrome_info_map_unittest.cc
chrome/browser/extensions/component_extensions_whitelist/whitelist.h
chrome/browser/extensions/content_script_apitest.cc
chrome/browser/extensions/convert_user_script_unittest.cc
chrome/browser/extensions/corb_and_cors_extension_browsertest.cc
chrome/browser/extensions/crx_installer_browsertest.cc
chrome/browser/extensions/extension_action_runner_unittest.cc
chrome/browser/extensions/extension_browser_window_helper.cc
chrome/browser/extensions/extension_browsertest.cc
chrome/browser/extensions/extension_context_menu_browsertest.cc
chrome/browser/extensions/extension_context_menu_model_unittest.cc
chrome/browser/extensions/extension_loading_browsertest.cc
chrome/browser/extensions/extension_message_bubble_controller_unittest.cc
chrome/browser/extensions/extension_messages_apitest.cc
chrome/browser/extensions/extension_override_apitest.cc
chrome/browser/extensions/extension_prefs_unittest.cc
chrome/browser/extensions/extension_service_sync_unittest.cc
chrome/browser/extensions/extension_service_unittest.cc
chrome/browser/extensions/extension_sync_data_unittest.cc
chrome/browser/extensions/extension_tab_util_browsertest.cc
chrome/browser/extensions/extension_tab_util_unittest.cc
chrome/browser/extensions/extension_unload_browsertest.cc
chrome/browser/extensions/extension_user_script_loader_unittest.cc
chrome/browser/extensions/external_policy_loader_unittest.cc
chrome/browser/extensions/external_pref_loader.cc
chrome/browser/extensions/external_provider_impl_chromeos_unittest.cc
chrome/browser/extensions/external_provider_impl_unittest.cc
chrome/browser/extensions/forced_extensions/installation_tracker_unittest.cc
chrome/browser/extensions/install_signer.cc
chrome/browser/extensions/installed_loader_unittest.cc
chrome/browser/extensions/lazy_background_page_apitest.cc
chrome/browser/extensions/menu_manager_unittest.cc
chrome/browser/extensions/navigation_observer.cc
chrome/browser/extensions/navigation_observer_browsertest.cc
chrome/browser/extensions/permission_message_combinations_unittest.cc
chrome/browser/extensions/permissions_updater.cc
chrome/browser/extensions/permissions_updater_unittest.cc
chrome/browser/extensions/policy_handlers_unittest.cc
chrome/browser/extensions/scripting_permissions_modifier.cc
chrome/browser/extensions/scripting_permissions_modifier.h
chrome/browser/extensions/scripting_permissions_modifier_unittest.cc
chrome/browser/extensions/updater/chrome_extension_downloader_factory.cc
chrome/browser/extensions/updater/chrome_update_client_config.cc
chrome/browser/extensions/updater/extension_updater_unittest.cc
chrome/browser/extensions/user_script_listener_unittest.cc
chrome/browser/feature_engagement/new_tab/new_tab_tracker_browsertest.cc
chrome/browser/feedback/feedback_uploader_chrome.cc
chrome/browser/feedback/show_feedback_page.cc
chrome/browser/feedback/system_logs/log_sources/crash_ids_source.cc
chrome/browser/flag-metadata.json
chrome/browser/google/google_search_domain_mixing_metrics_emitter.h
chrome/browser/google/google_search_domain_mixing_metrics_emitter_factory.h
chrome/browser/google/google_search_domain_mixing_metrics_emitter_unittest.cc
chrome/browser/google/google_update_win.cc
chrome/browser/guest_view/web_view/context_menu_content_type_web_view.cc
chrome/browser/hid/hid_chooser_context_unittest.cc
chrome/browser/history/android/android_history_provider_service_unittest.cc
chrome/browser/history/android/android_provider_backend_unittest.cc
chrome/browser/history/android/android_urls_database_unittest.cc
chrome/browser/history/android/bookmark_model_sql_handler_unittest.cc
chrome/browser/history/android/sqlite_cursor_unittest.cc
chrome/browser/history/android/urls_sql_handler_unittest.cc
chrome/browser/history/android/visit_sql_handler_unittest.cc
chrome/browser/history/domain_diversity_reporter_unittest.cc
chrome/browser/history/redirect_browsertest.cc
chrome/browser/importer/edge_importer_browsertest_win.cc
chrome/browser/importer/firefox_importer_browsertest.cc
chrome/browser/importer/firefox_profile_lock.cc
chrome/browser/importer/firefox_profile_lock.h
chrome/browser/importer/firefox_profile_lock_posix.cc
chrome/browser/importer/firefox_profile_lock_win.cc
chrome/browser/importer/ie_importer_browsertest_win.cc
chrome/browser/importer/profile_writer_unittest.cc
chrome/browser/installable/installable_manager_browsertest.cc
chrome/browser/lifetime/switch_utils_unittest.cc
chrome/browser/local_discovery/service_discovery_client_mac.mm
chrome/browser/lookalikes/lookalike_url_navigation_throttle.cc
chrome/browser/lookalikes/lookalike_url_navigation_throttle_browsertest.cc
chrome/browser/media/android/remote/flinging_controller_bridge.cc
chrome/browser/media/feeds/media_feeds_fetcher_unittest.cc
chrome/browser/media/history/media_history_keyed_service_unittest.cc
chrome/browser/media/history/media_history_store_unittest.cc
chrome/browser/media/media_engagement_contents_observer_unittest.cc
chrome/browser/media/media_engagement_preloaded_list_unittest.cc
chrome/browser/media/media_engagement_score_unittest.cc
chrome/browser/media/media_engagement_service_unittest.cc
chrome/browser/media/media_engagement_session_unittest.cc
chrome/browser/media/protected_media_identifier_permission_context.cc
chrome/browser/media/router/discovery/dial/safe_dial_device_description_parser_unittest.cc
chrome/browser/media/router/discovery/discovery_network_list_win.cc
chrome/browser/media/router/media_router_dialog_controller_unittest.cc
chrome/browser/media/router/media_router_metrics_unittest.cc
chrome/browser/media/router/media_sinks_observer.h
chrome/browser/media/router/mojo/media_router_mojo_impl_unittest.cc
chrome/browser/media/router/presentation/local_presentation_manager.h
chrome/browser/media/router/presentation/presentation_media_sinks_observer_unittest.cc
chrome/browser/media/router/providers/cast/cast_activity_manager_unittest.cc
chrome/browser/media/router/providers/cast/cast_media_route_provider.cc
chrome/browser/media/router/providers/cast/cast_media_route_provider_unittest.cc
chrome/browser/media/router/providers/dial/dial_media_route_provider.cc
chrome/browser/media/router/providers/dial/dial_media_route_provider_unittest.cc
chrome/browser/media/webrtc/webrtc_browsertest_common.cc
chrome/browser/media/webrtc/webrtc_event_log_uploader.cc
chrome/browser/media/webrtc/webrtc_log_uploader.cc
chrome/browser/media_galleries/fileapi/media_path_filter.cc
chrome/browser/memory/swap_thrashing_monitor_delegate_win.cc
chrome/browser/metrics/testing/sync_metrics_test_utils.cc
chrome/browser/metrics/thread_watcher_android.h
chrome/browser/nacl_host/nacl_infobar_delegate.cc
chrome/browser/navigation_predictor/navigation_predictor_browsertest.cc
chrome/browser/navigation_predictor/navigation_predictor_unittest.cc
chrome/browser/navigation_predictor/search_engine_preconnector.cc
chrome/browser/navigation_predictor/search_engine_preconnector_browsertest.cc
chrome/browser/net/cert_verify_proc_browsertest.cc
chrome/browser/net/dns_probe_runner.cc
chrome/browser/net/dns_probe_runner.h
chrome/browser/net/nss_context_chromeos_browsertest.cc
chrome/browser/net/proxy_browsertest.cc
chrome/browser/net/service_providers_win.cc
chrome/browser/net/trial_comparison_cert_verifier_controller.cc
chrome/browser/net/variations_http_headers_browsertest.cc
chrome/browser/no_best_effort_tasks_browsertest.cc
chrome/browser/notifications/chrome_ash_message_center_client_unittest.cc
chrome/browser/notifications/notification_channels_provider_android_unittest.cc
chrome/browser/notifications/notification_permission_context.h
chrome/browser/notifications/notification_permission_context_unittest.cc
chrome/browser/notifications/notification_platform_bridge_linux_unittest.cc
chrome/browser/notifications/notification_platform_bridge_mac.mm
chrome/browser/notifications/notification_platform_bridge_mac_unittest.mm
chrome/browser/notifications/notification_platform_bridge_win_unittest.cc
chrome/browser/notifications/platform_notification_service_unittest.cc
chrome/browser/notifications/win/notification_template_builder.cc
chrome/browser/optimization_guide/hints_fetcher_browsertest.cc
chrome/browser/optimization_guide/optimization_guide_hints_manager_unittest.cc
chrome/browser/optimization_guide/prediction/prediction_manager_browsertest.cc
chrome/browser/page_load_metrics/observers/aborts_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/ad_metrics/ads_page_load_metrics_observer.cc
chrome/browser/page_load_metrics/observers/ad_metrics/ads_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/amp_page_load_metrics_observer.h
chrome/browser/page_load_metrics/observers/amp_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/data_reduction_proxy_metrics_observer_test_utils.h
chrome/browser/page_load_metrics/observers/document_write_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/from_gws_page_load_metrics_observer.cc
chrome/browser/page_load_metrics/observers/from_gws_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/isolated_prerender_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/live_tab_count_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/loading_predictor_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/local_network_requests_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/media_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/multi_tab_loading_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/offline_page_previews_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/page_load_metrics_observer_test_harness.cc
chrome/browser/page_load_metrics/observers/previews_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/previews_ukm_observer_unittest.cc
chrome/browser/page_load_metrics/observers/protocol_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/scheme_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/service_worker_page_load_metrics_observer.cc
chrome/browser/page_load_metrics/observers/service_worker_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/session_restore_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/tab_restore_page_load_metrics_observer_unittest.cc
chrome/browser/page_load_metrics/observers/ukm_page_load_metrics_observer_unittest.cc
chrome/browser/password_manager/chrome_password_manager_client_unittest.cc
chrome/browser/password_manager/password_manager_browsertest.cc
chrome/browser/password_manager/password_manager_util_win.cc
chrome/browser/payments/hybrid_request_skip_ui_browsertest.cc
chrome/browser/payments/journey_logger_browsertest.cc
chrome/browser/payments/manifest_verifier_browsertest.cc
chrome/browser/payments/service_worker_payment_app_finder_browsertest.cc
chrome/browser/pdf/pdf_extension_test.cc
chrome/browser/pepper_broker_infobar_delegate.cc
chrome/browser/performance_manager/observers/metrics_collector_unittest.cc
chrome/browser/permissions/chrome_permission_manager_unittest.cc
chrome/browser/permissions/chrome_permission_request_manager_unittest.cc
chrome/browser/permissions/permission_context_base_feature_policy_unittest.cc
chrome/browser/platform_util_chromeos.cc
chrome/browser/platform_util_win.cc
chrome/browser/plugins/chrome_plugin_service_filter_unittest.cc
chrome/browser/plugins/flash_temporary_permission_tracker_unittest.cc
chrome/browser/plugins/plugin_info_host_impl_unittest.cc
chrome/browser/plugins/plugins_resource_service.cc
chrome/browser/policy/cloud/cloud_policy_browsertest.cc
chrome/browser/policy/cloud/user_policy_signin_service_unittest.cc
chrome/browser/policy/extension_policy_browsertest.cc
chrome/browser/policy/policy_browsertest.cc
chrome/browser/policy/policy_test_utils.cc
chrome/browser/policy/policy_test_utils.h
chrome/browser/policy/safe_browsing_policy_browsertest.cc
chrome/browser/policy/webusb_allow_devices_for_urls_policy_handler_unittest.cc
chrome/browser/predictors/autocomplete_action_predictor_table_unittest.cc
chrome/browser/predictors/loading_data_collector_unittest.cc
chrome/browser/predictors/loading_predictor_browsertest.cc
chrome/browser/predictors/loading_predictor_unittest.cc
chrome/browser/predictors/loading_stats_collector_unittest.cc
chrome/browser/predictors/preconnect_manager_unittest.cc
chrome/browser/predictors/resource_prefetch_predictor_tables_unittest.cc
chrome/browser/predictors/resource_prefetch_predictor_unittest.cc
chrome/browser/prefetch/prefetch_browsertest.cc
chrome/browser/prefs/chrome_command_line_pref_store_proxy_unittest.cc
chrome/browser/prefs/pref_functional_browsertest.cc
chrome/browser/prefs/pref_metrics_service.cc
chrome/browser/prefs/session_startup_pref_unittest.cc
chrome/browser/prerender/isolated/isolated_prerender_browsertest.cc
chrome/browser/prerender/isolated/isolated_prerender_tab_helper_unittest.cc
chrome/browser/prerender/prerender_browsertest.cc
chrome/browser/prerender/prerender_nostate_prefetch_browsertest.cc
chrome/browser/prerender/prerender_unittest.cc
chrome/browser/prerender/prerender_util_unittest.cc
chrome/browser/prerender/tools/prerender_test_server/index.html
chrome/browser/prerender/tools/prerender_test_server/prerender_test_server.py
chrome/browser/previews/previews_content_util_unittest.cc
chrome/browser/previews/previews_lite_page_redirect_browsertest.cc
chrome/browser/previews/previews_lite_page_redirect_decider_unittest.cc
chrome/browser/previews/previews_lite_page_redirect_url_loader_interceptor_unittest.cc
chrome/browser/previews/previews_offline_helper_unittest.cc
chrome/browser/printing/cloud_print/cloud_print_printer_list_unittest.cc
chrome/browser/printing/cloud_print/cloud_print_proxy_service_unittest.cc
chrome/browser/printing/cloud_print/gcd_api_flow_unittest.cc
chrome/browser/printing/cloud_print/privet_confirm_api_flow_unittest.cc
chrome/browser/printing/cloud_print/privet_http_unittest.cc
chrome/browser/printing/print_preview_dialog_controller_browsertest.cc
chrome/browser/printing/print_preview_dialog_controller_unittest.cc
chrome/browser/process_singleton_posix.cc
chrome/browser/process_singleton_posix_unittest.cc
chrome/browser/process_singleton_win.cc
chrome/browser/profile_resetter/profile_resetter_unittest.cc
chrome/browser/profile_resetter/reset_report_uploader.cc
chrome/browser/profiles/gaia_info_update_service_unittest.cc
chrome/browser/profiles/profile.h
chrome/browser/profiles/profile_attributes_entry.h
chrome/browser/profiles/profile_attributes_storage_unittest.cc
chrome/browser/profiles/profile_avatar_downloader.cc
chrome/browser/profiles/profile_downloader_unittest.cc
chrome/browser/profiles/profile_impl.cc
chrome/browser/profiles/profile_impl.h
chrome/browser/profiles/profile_manager.h
chrome/browser/profiles/profile_manager_browsertest.cc
chrome/browser/profiles/profile_manager_unittest.cc
chrome/browser/profiles/profile_shortcut_manager_win.cc
chrome/browser/profiles/profile_window.cc
chrome/browser/push_messaging/push_messaging_constants.cc
chrome/browser/push_messaging/push_messaging_notification_manager.h
chrome/browser/push_messaging/push_messaging_notification_manager_unittest.cc
chrome/browser/push_messaging/push_messaging_service_impl.cc
chrome/browser/renderer_context_menu/render_view_context_menu_browsertest.cc
chrome/browser/reputation/local_heuristics.cc
chrome/browser/resource_coordinator/tab_load_tracker_unittest.cc
chrome/browser/resource_coordinator/tab_manager_web_contents_data_unittest.cc
chrome/browser/resource_coordinator/tab_metrics_logger_unittest.cc
chrome/browser/resource_coordinator/tab_ranker/tab_features_test_helper.cc
chrome/browser/resources/PRESUBMIT.py
chrome/browser/resources/bookmarks/command_manager.js
chrome/browser/resources/chromeos/about_os_credits.html
chrome/browser/resources/chromeos/accessibility/chromevox/background/annotation/user_annotation_handler.js
chrome/browser/resources/chromeos/accessibility/chromevox/background/background_test.js
chrome/browser/resources/chromeos/accessibility/chromevox/background/command_handler.js
chrome/browser/resources/chromeos/accessibility/chromevox/background/cursors_test.js
chrome/browser/resources/chromeos/accessibility/chromevox/background/locale_output_helper_test.js
chrome/browser/resources/chromeos/accessibility/chromevox/background/prefs.js
chrome/browser/resources/chromeos/accessibility/chromevox/braille/braille_input_handler_test.js
chrome/browser/resources/chromeos/accessibility/chromevox/common/spannable_test.js
chrome/browser/resources/chromeos/accessibility/chromevox/options/options.css
chrome/browser/resources/chromeos/accessibility/chromevox/options/options.js
chrome/browser/resources/chromeos/accessibility/chromevox/panel/panel.html
chrome/browser/resources/chromeos/accessibility/chromevox/panel/panel.js
chrome/browser/resources/chromeos/accessibility/chromevox/panel/tutorial.js
chrome/browser/resources/chromeos/accessibility/chromevox/tools/webstore_extension_util.py
chrome/browser/resources/chromeos/accessibility/chromevox_manifest.json.jinja2
chrome/browser/resources/chromeos/accessibility/select_to_speak/mock_tts.js
chrome/browser/resources/chromeos/accessibility/select_to_speak/options.css
chrome/browser/resources/chromeos/accessibility/select_to_speak/paragraph_utils_unittest.js
chrome/browser/resources/chromeos/accessibility/select_to_speak/select_to_speak.js
chrome/browser/resources/chromeos/accessibility/select_to_speak/select_to_speak_unittest.js
chrome/browser/resources/chromeos/accessibility/select_to_speak/test_support.js
chrome/browser/resources/chromeos/accessibility/select_to_speak_manifest.json.jinja2
chrome/browser/resources/chromeos/accessibility/switch_access/switch_access.js
chrome/browser/resources/chromeos/accessibility/switch_access/switch_access_predicate_test.js
chrome/browser/resources/chromeos/add_supervision/add_supervision.js
chrome/browser/resources/chromeos/arc_support/background.js
chrome/browser/resources/chromeos/arc_support/playstore.js
chrome/browser/resources/chromeos/assistant_optin/assistant_value_prop.js
chrome/browser/resources/chromeos/camera/src/js/lib/google-analytics-bundle.js
chrome/browser/resources/chromeos/camera/src/js/metrics.js
chrome/browser/resources/chromeos/camera/src/js/util.js
chrome/browser/resources/chromeos/camera/src/js/views/camera_intent.js
chrome/browser/resources/chrome
gitextract_xa9r_bzi/

├── .cirrus.yml
├── .cirrus_Dockerfile
├── .cirrus_requirements.txt
├── .github/
│   ├── ISSUE_TEMPLATE/
│   │   ├── bugreport.md
│   │   ├── create-an--updating-to-chromium-x-x-x-x-.md
│   │   ├── feature_request.md
│   │   └── other.md
│   └── PULL_REQUEST_TEMPLATE.md
├── .gitignore
├── .style.yapf
├── LICENSE
├── README.md
├── SUPPORT.md
├── chromium_version.txt
├── devutils/
│   ├── .coveragerc
│   ├── README.md
│   ├── __init__.py
│   ├── check_all_code.sh
│   ├── check_downloads_ini.py
│   ├── check_files_exist.py
│   ├── check_gn_flags.py
│   ├── check_patch_files.py
│   ├── print_tag_version.sh
│   ├── pytest.ini
│   ├── run_devutils_pylint.py
│   ├── run_devutils_tests.sh
│   ├── run_devutils_yapf.sh
│   ├── run_other_pylint.py
│   ├── run_other_yapf.sh
│   ├── run_utils_pylint.py
│   ├── run_utils_tests.sh
│   ├── run_utils_yapf.sh
│   ├── set_quilt_vars.sh
│   ├── tests/
│   │   ├── __init__.py
│   │   ├── test_check_patch_files.py
│   │   └── test_validate_patches.py
│   ├── third_party/
│   │   ├── README.md
│   │   ├── __init__.py
│   │   └── unidiff/
│   │       ├── __init__.py
│   │       ├── __version__.py
│   │       ├── constants.py
│   │       ├── errors.py
│   │       └── patch.py
│   ├── update_lists.py
│   ├── update_platform_patches.py
│   ├── validate_config.py
│   └── validate_patches.py
├── docs/
│   ├── building.md
│   ├── contributing.md
│   ├── design.md
│   ├── developing.md
│   ├── flags.md
│   ├── platforms.md
│   └── repo_management.md
├── domain_regex.list
├── domain_substitution.list
├── downloads.ini
├── flags.gn
├── patches/
│   ├── core/
│   │   ├── bromite/
│   │   │   └── disable-fetching-field-trials.patch
│   │   ├── debian/
│   │   │   └── disable/
│   │   │       └── unrar.patch
│   │   ├── inox-patchset/
│   │   │   ├── 0001-fix-building-without-safebrowsing.patch
│   │   │   ├── 0003-disable-autofill-download-manager.patch
│   │   │   ├── 0005-disable-default-extensions.patch
│   │   │   ├── 0007-disable-web-resource-service.patch
│   │   │   ├── 0009-disable-google-ipv6-probes.patch
│   │   │   ├── 0014-disable-translation-lang-fetch.patch
│   │   │   ├── 0015-disable-update-pings.patch
│   │   │   ├── 0017-disable-new-avatar-menu.patch
│   │   │   └── 0021-disable-rlz.patch
│   │   ├── iridium-browser/
│   │   │   ├── all-add-trk-prefixes-to-possibly-evil-connections.patch
│   │   │   ├── safe_browsing-disable-incident-reporting.patch
│   │   │   └── safe_browsing-disable-reporting-of-safebrowsing-over.patch
│   │   └── ungoogled-chromium/
│   │       ├── block-trk-and-subdomains.patch
│   │       ├── disable-crash-reporter.patch
│   │       ├── disable-domain-reliability.patch
│   │       ├── disable-fonts-googleapis-references.patch
│   │       ├── disable-gaia.patch
│   │       ├── disable-gcm.patch
│   │       ├── disable-google-host-detection.patch
│   │       ├── disable-mei-preload.patch
│   │       ├── disable-network-time-tracker.patch
│   │       ├── disable-profile-avatar-downloading.patch
│   │       ├── disable-signin.patch
│   │       ├── disable-translate.patch
│   │       ├── disable-untraceable-urls.patch
│   │       ├── disable-webrtc-log-uploader.patch
│   │       ├── disable-webstore-urls.patch
│   │       ├── fix-building-without-enabling-reporting.patch
│   │       ├── fix-building-without-one-click-signin.patch
│   │       ├── fix-building-without-safebrowsing.patch
│   │       ├── fix-learn-doubleclick-hsts.patch
│   │       ├── remove-unused-preferences-fields.patch
│   │       ├── replace-google-search-engine-with-nosearch.patch
│   │       └── use-local-devtools-files.patch
│   ├── extra/
│   │   ├── bromite/
│   │   │   ├── fingerprinting-flags-client-rects-and-measuretext.patch
│   │   │   ├── flag-fingerprinting-canvas-image-data-noise.patch
│   │   │   └── flag-max-connections-per-host.patch
│   │   ├── debian/
│   │   │   ├── disable/
│   │   │   │   ├── android.patch
│   │   │   │   ├── device-notifications.patch
│   │   │   │   ├── fuzzers.patch
│   │   │   │   ├── google-api-warning.patch
│   │   │   │   └── welcome-page.patch
│   │   │   ├── fixes/
│   │   │   │   └── connection-message.patch
│   │   │   ├── gn/
│   │   │   │   └── parallel.patch
│   │   │   └── warnings/
│   │   │       └── initialization.patch
│   │   ├── inox-patchset/
│   │   │   ├── 0006-modify-default-prefs.patch
│   │   │   ├── 0008-restore-classic-ntp.patch
│   │   │   ├── 0011-add-duckduckgo-search-engine.patch
│   │   │   ├── 0013-disable-missing-key-warning.patch
│   │   │   ├── 0016-chromium-sandbox-pie.patch
│   │   │   ├── 0018-disable-first-run-behaviour.patch
│   │   │   └── 0019-disable-battery-status-service.patch
│   │   ├── iridium-browser/
│   │   │   ├── Remove-EV-certificates.patch
│   │   │   ├── browser-disable-profile-auto-import-on-first-run.patch
│   │   │   ├── mime_util-force-text-x-suse-ymp-to-be-downloaded.patch
│   │   │   ├── net-cert-increase-default-key-length-for-newly-gener.patch
│   │   │   ├── prefs-always-prompt-for-download-directory-by-defaul.patch
│   │   │   ├── prefs-only-keep-cookies-until-exit.patch
│   │   │   └── updater-disable-auto-update.patch
│   │   └── ungoogled-chromium/
│   │       ├── add-components-ungoogled.patch
│   │       ├── add-flag-for-pdf-plugin-name.patch
│   │       ├── add-flag-for-search-engine-collection.patch
│   │       ├── add-flag-to-configure-extension-downloading.patch
│   │       ├── add-flag-to-disable-beforeunload.patch
│   │       ├── add-flag-to-force-punycode-hostnames.patch
│   │       ├── add-flag-to-hide-crashed-bubble.patch
│   │       ├── add-flag-to-scroll-tabs.patch
│   │       ├── add-flag-to-show-avatar-button.patch
│   │       ├── add-flag-to-stack-tabs.patch
│   │       ├── add-ipv6-probing-option.patch
│   │       ├── add-suggestions-url-field.patch
│   │       ├── default-to-https-scheme.patch
│   │       ├── disable-download-quarantine.patch
│   │       ├── disable-formatting-in-omnibox.patch
│   │       ├── disable-intranet-redirect-detector.patch
│   │       ├── disable-webgl-renderer-info.patch
│   │       ├── enable-checkbox-external-protocol.patch
│   │       ├── enable-page-saving-on-more-pages.patch
│   │       ├── enable-paste-and-go-new-tab-button.patch
│   │       ├── fix-building-without-mdns-and-service-discovery.patch
│   │       ├── popups-to-tabs.patch
│   │       ├── remove-disable-setuid-sandbox-as-bad-flag.patch
│   │       └── searx.patch
│   └── series
├── pruning.list
├── revision.txt
└── utils/
    ├── .coveragerc
    ├── __init__.py
    ├── _common.py
    ├── _extraction.py
    ├── domain_substitution.py
    ├── downloads.py
    ├── filescfg.py
    ├── patches.py
    ├── prune_binaries.py
    ├── pytest.ini
    ├── tests/
    │   ├── __init__.py
    │   ├── test_domain_substitution.py
    │   └── test_patches.py
    └── third_party/
        ├── README.md
        ├── __init__.py
        └── schema.py
SYMBOL INDEX (245 symbols across 25 files)

FILE: devutils/check_downloads_ini.py
  function check_downloads_ini (line 27) | def check_downloads_ini(downloads_ini_iter):
  function main (line 42) | def main():

FILE: devutils/check_files_exist.py
  function main (line 17) | def main():

FILE: devutils/check_gn_flags.py
  function check_gn_flags (line 27) | def check_gn_flags(gn_flags_path):
  function main (line 59) | def main():

FILE: devutils/check_patch_files.py
  function _read_series_file (line 33) | def _read_series_file(patches_dir, series_file, join_dir=False):
  function check_patch_readability (line 49) | def check_patch_readability(patches_dir, series_path=Path('series')):
  function check_unused_patches (line 72) | def check_unused_patches(patches_dir, series_path=Path('series')):
  function check_series_duplicates (line 97) | def check_series_duplicates(patches_dir, series_path=Path('series')):
  function main (line 114) | def main():
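
The `check_series_duplicates` function above flags patches listed more than once in the quilt-style `patches/series` file. As a hedged illustration of that idea (not the actual devutils code, which reads the series file itself), `find_series_duplicates` below is a hypothetical helper operating on an already-parsed list of patch names:

```python
def find_series_duplicates(patch_names):
    """Return patch names listed more than once, in first-repeat order.

    Illustrative sketch only; the real check_series_duplicates in
    devutils/check_patch_files.py works directly on patches/series.
    """
    seen = set()
    dupes = []
    for name in patch_names:
        if name in seen and name not in dupes:
            dupes.append(name)
        seen.add(name)
    return dupes

series = [
    "core/ungoogled-chromium/disable-gcm.patch",
    "extra/ungoogled-chromium/popups-to-tabs.patch",
    "core/ungoogled-chromium/disable-gcm.patch",  # accidental repeat
]
print(find_series_duplicates(series))  # ['core/ungoogled-chromium/disable-gcm.patch']
```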

FILE: devutils/run_devutils_pylint.py
  function main (line 15) | def main():

FILE: devutils/run_other_pylint.py
  class ChangeDir (line 16) | class ChangeDir:
    method __init__ (line 21) | def __init__(self, path):
    method __enter__ (line 25) | def __enter__(self):
    method __exit__ (line 28) | def __exit__(self, *_):
  function run_pylint (line 32) | def run_pylint(module_path, pylint_options, ignore_prefixes=tuple()):
  function main (line 66) | def main():

FILE: devutils/run_utils_pylint.py
  function main (line 15) | def main():

FILE: devutils/tests/test_check_patch_files.py
  function test_check_series_duplicates (line 14) | def test_check_series_duplicates():

FILE: devutils/tests/test_validate_patches.py
  function test_test_patches (line 20) | def test_test_patches(caplog):

FILE: devutils/third_party/unidiff/errors.py
  class UnidiffParseError (line 30) | class UnidiffParseError(Exception):

FILE: devutils/third_party/unidiff/patch.py
  function implements_to_string (line 56) | def implements_to_string(cls):
  class Line (line 70) | class Line(object):
    method __init__ (line 73) | def __init__(self, value, line_type,
    method __repr__ (line 82) | def __repr__(self):
    method __str__ (line 85) | def __str__(self):
    method __eq__ (line 88) | def __eq__(self, other):
    method is_added (line 96) | def is_added(self):
    method is_removed (line 100) | def is_removed(self):
    method is_context (line 104) | def is_context(self):
  class PatchInfo (line 109) | class PatchInfo(list):
    method __repr__ (line 117) | def __repr__(self):
    method __str__ (line 121) | def __str__(self):
  class Hunk (line 126) | class Hunk(list):
    method __init__ (line 129) | def __init__(self, src_start=0, src_len=0, tgt_start=0, tgt_len=0,
    method __repr__ (line 145) | def __repr__(self):
    method __str__ (line 153) | def __str__(self):
    method append (line 162) | def append(self, line):
    method is_valid (line 176) | def is_valid(self):
    method source_lines (line 181) | def source_lines(self):
    method target_lines (line 185) | def target_lines(self):
  class PatchedFile (line 190) | class PatchedFile(list):
    method __init__ (line 193) | def __init__(self, patch_info=None, source='', target='',
    method __repr__ (line 202) | def __repr__(self):
    method __str__ (line 205) | def __str__(self):
    method _parse_hunk (line 217) | def _parse_hunk(self, header, diff, encoding):
    method _add_no_newline_marker_to_last_hunk (line 281) | def _add_no_newline_marker_to_last_hunk(self):
    method _append_trailing_empty_line (line 289) | def _append_trailing_empty_line(self):
    method path (line 296) | def path(self):
    method added (line 312) | def added(self):
    method removed (line 317) | def removed(self):
    method is_added_file (line 322) | def is_added_file(self):
    method is_removed_file (line 328) | def is_removed_file(self):
    method is_modified_file (line 334) | def is_modified_file(self):
  class PatchSet (line 340) | class PatchSet(list):
    method __init__ (line 343) | def __init__(self, f, encoding=None):
    method __repr__ (line 355) | def __repr__(self):
    method __str__ (line 358) | def __str__(self):
    method _parse (line 361) | def _parse(self, diff, encoding):
    method from_filename (line 422) | def from_filename(cls, filename, encoding=DEFAULT_ENCODING, errors=None):
    method _convert_string (line 429) | def _convert_string(data, encoding=None, errors='strict'):
    method from_string (line 436) | def from_string(cls, data, encoding=None, errors='strict'):
    method added_files (line 441) | def added_files(self):
    method removed_files (line 446) | def removed_files(self):
    method modified_files (line 451) | def modified_files(self):
    method added (line 456) | def added(self):
    method removed (line 461) | def removed(self):
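
The `Hunk` class above carries `src_start`/`src_len`/`tgt_start`/`tgt_len` fields parsed from a unified-diff `@@` header. A minimal stdlib-only sketch of that header format (an illustration of the notation, not the vendored parser; `parse_hunk_header` is a hypothetical helper):

```python
import re

# Unified-diff hunk headers look like "@@ -src_start,src_len +tgt_start,tgt_len @@".
# This mirrors the (src_start, src_len, tgt_start, tgt_len) fields of the Hunk
# class listed above.
HUNK_HEADER = re.compile(r"^@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@")

def parse_hunk_header(line):
    match = HUNK_HEADER.match(line)
    if match is None:
        raise ValueError(f"not a hunk header: {line!r}")
    src_start, src_len, tgt_start, tgt_len = match.groups()
    # A missing length (e.g. "@@ -1 +1 @@") denotes a single-line range.
    return (int(src_start), int(src_len or 1), int(tgt_start), int(tgt_len or 1))

print(parse_hunk_header("@@ -12,4 +12,6 @@"))  # (12, 4, 12, 6)
```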

FILE: devutils/update_lists.py
  class UnusedPatterns (line 115) | class UnusedPatterns: #pylint: disable=too-few-public-methods
    method __init__ (line 121) | def __init__(self):
    method log_unused (line 127) | def log_unused(self):
  function _is_binary (line 142) | def _is_binary(bytes_data):
  function _dir_empty (line 150) | def _dir_empty(path):
  function should_prune (line 163) | def should_prune(path, relative_path, unused_patterns):
  function _check_regex_match (line 192) | def _check_regex_match(file_path, search_regex):
  function should_domain_substitute (line 213) | def should_domain_substitute(path, relative_path, search_regex, unused_p...
  function compute_lists (line 234) | def compute_lists(source_tree, search_regex):
  function main (line 288) | def main(args_list=None):

FILE: devutils/update_platform_patches.py
  function merge_platform_patches (line 28) | def merge_platform_patches(platform_patches_dir, prepend_patches_dir):
  function _dir_empty (line 52) | def _dir_empty(path):
  function _remove_files_with_dirs (line 65) | def _remove_files_with_dirs(root_dir, sorted_file_iter):
  function unmerge_platform_patches (line 87) | def unmerge_platform_patches(platform_patches_dir):
  function main (line 157) | def main():

FILE: devutils/validate_config.py
  function main (line 34) | def main():

FILE: devutils/validate_patches.py
  class _VerboseRetry (line 39) | class _VerboseRetry(urllib3.util.Retry):
    method sleep_for_retry (line 42) | def sleep_for_retry(self, response=None):
    method _sleep_backoff (line 56) | def _sleep_backoff(self):
  function _get_requests_session (line 61) | def _get_requests_session():
  function _get_requests_session (line 76) | def _get_requests_session():
  class _PatchValidationError (line 85) | class _PatchValidationError(Exception):
  class _UnexpectedSyntaxError (line 89) | class _UnexpectedSyntaxError(RuntimeError):
  class _NotInRepoError (line 93) | class _NotInRepoError(RuntimeError):
  class _DepsNodeVisitor (line 97) | class _DepsNodeVisitor(ast.NodeVisitor):
    method visit_Call (line 102) | def visit_Call(self, node): #pylint: disable=invalid-name
    method generic_visit (line 108) | def generic_visit(self, node):
  function _validate_deps (line 117) | def _validate_deps(deps_text):
  function _deps_var (line 127) | def _deps_var(deps_globals):
  function _parse_deps (line 137) | def _parse_deps(deps_text):
  function _download_googlesource_file (line 145) | def _download_googlesource_file(download_session, repo_url, version, rel...
  function _get_dep_value_url (line 162) | def _get_dep_value_url(deps_globals, dep_value):
  function _process_deps_entries (line 182) | def _process_deps_entries(deps_globals, child_deps_tree, child_path, dep...
  function _get_child_deps_tree (line 210) | def _get_child_deps_tree(download_session, current_deps_tree, child_path...
  function _get_last_chromium_modification (line 224) | def _get_last_chromium_modification():
  function _get_gitiles_git_log_date (line 234) | def _get_gitiles_git_log_date(log_entry):
  function _get_gitiles_commit_before_date (line 239) | def _get_gitiles_commit_before_date(repo_url, target_branch, target_date...
  class _FallbackRepoManager (line 272) | class _FallbackRepoManager:
    method __init__ (line 277) | def __init__(self):
    method gn_version (line 281) | def gn_version(self):
    method get_fallback (line 298) | def get_fallback(self, current_relative_path, current_node, root_deps_...
  function _get_target_file_deps_node (line 318) | def _get_target_file_deps_node(download_session, root_deps_tree, target_...
  function _download_source_file (line 349) | def _download_source_file(download_session, root_deps_tree, fallback_rep...
  function _initialize_deps_tree (line 386) | def _initialize_deps_tree():
  function _retrieve_remote_files (line 406) | def _retrieve_remote_files(file_iter):
  function _retrieve_local_files (line 455) | def _retrieve_local_files(file_iter, source_dir):
  function _modify_file_lines (line 485) | def _modify_file_lines(patched_file, file_lines):
  function _apply_file_unidiff (line 519) | def _apply_file_unidiff(patched_file, files_under_test):
  function _dry_check_patched_file (line 538) | def _dry_check_patched_file(patched_file, orig_file_content):
  function _test_patches (line 554) | def _test_patches(series_iter, patch_cache, files_under_test):
  function _load_all_patches (line 591) | def _load_all_patches(series_iter, patches_dir):
  function _get_required_files (line 611) | def _get_required_files(patch_cache):
  function _get_files_under_test (line 624) | def _get_files_under_test(args, required_files, parser):
  function main (line 644) | def main():

FILE: utils/_common.py
  class PlatformEnum (line 24) | class PlatformEnum(enum.Enum):
  class ExtractorEnum (line 30) | class ExtractorEnum: #pylint: disable=too-few-public-methods
  class SetLogLevel (line 37) | class SetLogLevel(argparse.Action): #pylint: disable=too-few-public-methods
    method __init__ (line 40) | def __init__(self, option_strings, dest, nargs=None, **kwargs):
    method __call__ (line 43) | def __call__(self, parser, namespace, value, option_string=None):
  function get_logger (line 63) | def get_logger(initial_level=logging.INFO):
  function set_logging_level (line 83) | def set_logging_level(logging_level):
  function get_running_platform (line 99) | def get_running_platform():
  function get_chromium_version (line 113) | def get_chromium_version():
  function parse_series (line 118) | def parse_series(series_path):
  function add_common_params (line 135) | def add_common_params(parser):
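
`parse_series` reads a quilt-style `series` file listing patch files in application order. A hedged sketch of that parsing, assuming the common quilt conventions of `#` comments, blank lines, and optional trailing options; the real function takes a path and its exact rules may differ:

```python
import io

def parse_series(series_file):
    """Yield patch paths from a quilt-style series file, skipping
    blank lines and '#' comments (illustrative sketch)."""
    for line in series_file:
        line = line.strip()
        if line and not line.startswith('#'):
            # quilt allows options after the path; keep only the path
            yield line.split(' ')[0]

sample = io.StringIO("# comment\n\ncore/fix-a.patch\nextra/fix-b.patch -p1\n")
patches = list(parse_series(sample))
```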

FILE: utils/_extraction.py
  class ExtractionError (line 25) | class ExtractionError(BaseException):
  function _find_7z_by_registry (line 29) | def _find_7z_by_registry():
  function _find_winrar_by_registry (line 49) | def _find_winrar_by_registry():
  function _find_extractor_by_cmd (line 69) | def _find_extractor_by_cmd(extractor_cmd):
  function _process_relative_to (line 78) | def _process_relative_to(unpack_root, relative_to):
  function _extract_tar_with_7z (line 98) | def _extract_tar_with_7z(binary, archive_path, output_dir, relative_to):
  function _extract_tar_with_tar (line 121) | def _extract_tar_with_tar(binary, archive_path, output_dir, relative_to):
  function _extract_tar_with_winrar (line 136) | def _extract_tar_with_winrar(binary, archive_path, output_dir, relative_...
  function _extract_tar_with_python (line 149) | def _extract_tar_with_python(archive_path, output_dir, relative_to):
  function extract_tar_file (line 200) | def extract_tar_file(archive_path, output_dir, relative_to, extractors=N...
  function extract_with_7z (line 251) | def extract_with_7z(
  function extract_with_winrar (line 297) | def extract_with_winrar(
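
The `relative_to` parameter threaded through these extractors suggests archive entries are re-rooted by stripping a leading path component. A sketch of that behavior using the stdlib `tarfile` module (illustrative only; `_process_relative_to` may handle more cases than this):

```python
import pathlib
import tarfile
import tempfile

def extract_tar_relative_to(archive_path, output_dir, relative_to):
    """Extract a tar archive while stripping a leading path component,
    in the spirit of _process_relative_to (not the repo's exact logic)."""
    prefix = relative_to.rstrip('/') + '/'
    with tarfile.open(archive_path) as tar:
        for member in tar.getmembers():
            if not member.name.startswith(prefix):
                continue  # entries outside the prefix are ignored
            member.name = member.name[len(prefix):]
            if member.name:
                tar.extract(member, output_dir)

# Demo: an archive whose contents live under 'chromium-src/'
tmp = pathlib.Path(tempfile.mkdtemp())
src = tmp / 'readme.txt'
src.write_text('hello')
archive = tmp / 'a.tar'
with tarfile.open(archive, 'w') as tar:
    tar.add(src, arcname='chromium-src/readme.txt')
out = tmp / 'out'
extract_tar_relative_to(archive, out, 'chromium-src')
```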

FILE: utils/domain_substitution.py
  class DomainRegexList (line 38) | class DomainRegexList:
    method __init__ (line 45) | def __init__(self, path):
    method _compile_regex (line 51) | def _compile_regex(self, line):
    method regex_pairs (line 57) | def regex_pairs(self):
    method search_regex (line 66) | def search_regex(self):
  function _substitute_path (line 77) | def _substitute_path(path, regex_iter):
  function _validate_file_index (line 118) | def _validate_file_index(index_file, resolved_tree, cache_index_files):
  function _update_timestamp (line 156) | def _update_timestamp(path: os.PathLike, set_new: bool) -> None:
  function apply_substitution (line 177) | def apply_substitution(regex_path, files_path, source_tree, domainsub_ca...
  function revert_substitution (line 239) | def revert_substitution(domainsub_cache, source_tree):
  function _callback (line 306) | def _callback(args):
  function main (line 314) | def main():
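
`DomainRegexList` compiles rules from `domain_regex.list`, whose entries use a `pattern#replacement` layout. A hedged sketch of compiling and applying such pairs; the rule shown is hypothetical but follows the same shape as the real list entries:

```python
import re

def compile_regex_pairs(lines):
    """Each line holds 'pattern#replacement'; compile into
    (regex, replacement) pairs (illustrative sketch)."""
    pairs = []
    for line in lines:
        pattern, replacement = line.split('#', 1)
        pairs.append((re.compile(pattern), replacement))
    return pairs

def substitute(text, pairs):
    """Apply every regex pair to the text, as domain substitution does."""
    for regex, replacement in pairs:
        text = regex.sub(replacement, text)
    return text

# Hypothetical rule mirroring the real list's structure
pairs = compile_regex_pairs([r'google\.com#9oo91e.qjz9zk'])
result = substitute('https://google.com/search', pairs)
```

Reversing the substitution cannot be done with the regexes alone, which is presumably why `revert_substitution` works from a cache of original file contents.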

FILE: utils/downloads.py
  class HashesURLEnum (line 32) | class HashesURLEnum(str, enum.Enum):
  class HashMismatchError (line 37) | class HashMismatchError(BaseException):
  class DownloadInfo (line 41) | class DownloadInfo: #pylint: disable=too-few-public-methods
    method _is_hash_url (line 57) | def _is_hash_url(value):
    class _DownloadsProperties (line 75) | class _DownloadsProperties: #pylint: disable=too-few-public-methods
      method __init__ (line 76) | def __init__(self, section_dict, passthrough_properties, hashes):
      method has_hash_url (line 81) | def has_hash_url(self):
      method __getattr__ (line 87) | def __getattr__(self, name):
    method _parse_data (line 101) | def _parse_data(self, path):
    method __init__ (line 125) | def __init__(self, ini_paths):
    method __getitem__ (line 131) | def __getitem__(self, section):
    method __contains__ (line 139) | def __contains__(self, item):
    method __iter__ (line 145) | def __iter__(self):
    method properties_iter (line 149) | def properties_iter(self):
  class _UrlRetrieveReportHook (line 155) | class _UrlRetrieveReportHook: #pylint: disable=too-few-public-methods
    method __init__ (line 158) | def __init__(self):
    method __call__ (line 162) | def __call__(self, block_count, block_size, total_size):
  function _download_via_urllib (line 185) | def _download_via_urllib(url, file_path, show_progress, disable_ssl_veri...
  function _download_if_needed (line 204) | def _download_if_needed(file_path, url, show_progress, disable_ssl_verif...
  function _chromium_hashes_generator (line 245) | def _chromium_hashes_generator(hashes_path):
  function _get_hash_pairs (line 255) | def _get_hash_pairs(download_properties, cache_dir):
  function retrieve_downloads (line 268) | def retrieve_downloads(download_info, cache_dir, show_progress, disable_...
  function check_downloads (line 298) | def check_downloads(download_info, cache_dir):
  function unpack_downloads (line 319) | def unpack_downloads(download_info, cache_dir, output_dir, extractors=No...
  function _add_common_args (line 357) | def _add_common_args(parser):
  function _retrieve_callback (line 368) | def _retrieve_callback(args):
  function _unpack_callback (line 378) | def _unpack_callback(args):
  function main (line 387) | def main():
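
`HashMismatchError` and `_chromium_hashes_generator` point to downloads being verified against known digests. A self-contained sketch of chunked hash verification with `hashlib`; `verify_hash` is illustrative, not the module's API:

```python
import hashlib
import pathlib
import tempfile

def verify_hash(file_path, algorithm, expected_hex):
    """Hash a file in fixed-size chunks and compare against the expected
    hex digest, raising on mismatch (illustrative sketch)."""
    hasher = hashlib.new(algorithm)
    with open(file_path, 'rb') as handle:
        for chunk in iter(lambda: handle.read(65536), b''):
            hasher.update(chunk)
    if hasher.hexdigest() != expected_hex:
        raise ValueError(f'hash mismatch for {file_path}')

# Demo: SHA-256 of b'abc' is a well-known test vector
path = pathlib.Path(tempfile.mkdtemp(), 'data.bin')
path.write_bytes(b'abc')
verify_hash(path, 'sha256',
            'ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad')
```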

FILE: utils/filescfg.py
  function filescfg_generator (line 19) | def filescfg_generator(cfg_path, build_outputs, cpu_arch):
  function _get_archive_writer (line 46) | def _get_archive_writer(output_path):
  function create_archive (line 84) | def create_archive(file_iter, include_iter, build_outputs, output_path):
  function _files_generator_by_args (line 100) | def _files_generator_by_args(args):
  function _list_callback (line 115) | def _list_callback(args):
  function _archive_callback (line 120) | def _archive_callback(args):
  function main (line 129) | def main():
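
`_get_archive_writer` decides how build outputs are packaged based on the output path. A rough sketch of extension-based writer selection (assumption: the real helper returns a write callback rather than the archive object, and may support other formats):

```python
import pathlib
import tarfile
import tempfile
import zipfile

def get_archive_writer(output_path):
    """Choose an archive object from the output file's extension,
    echoing the role of _get_archive_writer (illustrative sketch)."""
    output_path = pathlib.Path(output_path)
    if output_path.suffix == '.zip':
        return zipfile.ZipFile(output_path, 'w')
    if output_path.name.endswith('.tar.xz'):
        return tarfile.open(output_path, 'w:xz')
    return tarfile.open(output_path, 'w')

tmp = pathlib.Path(tempfile.mkdtemp())
writer = get_archive_writer(tmp / 'out.zip')
writer.close()
```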

FILE: utils/patches.py
  function _find_patch_from_env (line 18) | def _find_patch_from_env():
  function _find_patch_from_which (line 35) | def _find_patch_from_which():
  function find_and_check_patch (line 43) | def find_and_check_patch(patch_bin_path=None):
  function dry_run_check (line 78) | def dry_run_check(patch_path, tree_path, patch_bin_path=None):
  function apply_patches (line 100) | def apply_patches(patch_path_iter, tree_path, reverse=False, patch_bin_p...
  function generate_patches_from_series (line 135) | def generate_patches_from_series(patches_dir, resolve=False):
  function _copy_files (line 144) | def _copy_files(path_iter, source, destination):
  function merge_patches (line 151) | def merge_patches(source_iter, destination, prepend=False):
  function _apply_callback (line 184) | def _apply_callback(args, parser_error):
  function _merge_callback (line 204) | def _merge_callback(args, _):
  function main (line 208) | def main():
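
`dry_run_check` and `apply_patches` wrap the system `patch` binary. A sketch that only builds the argument list, so it can be inspected without running `patch`; the exact flag set here is assumed from common quilt-style usage, not copied from the repo:

```python
def patch_command(patch_path, tree_path, reverse=False, dry_run=False):
    """Build a GNU patch argument list in the style of
    dry_run_check/apply_patches (illustrative sketch)."""
    cmd = ['patch', '-p1', '--ignore-whitespace',
           '-i', str(patch_path), '-d', str(tree_path),
           '--no-backup-if-mismatch']
    if reverse:
        cmd.append('--reverse')
    if dry_run:
        cmd.append('--dry-run')
    return cmd

cmd = patch_command('some.patch', 'src', dry_run=True)
```

The list can then be handed to `subprocess.run(cmd, check=True)`; running with `--dry-run` first is what lets all patches be validated before any file is touched.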

FILE: utils/prune_binaries.py
  function prune_dir (line 15) | def prune_dir(unpack_root, prune_files):
  function _callback (line 32) | def _callback(args):
  function main (line 47) | def main():
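
`prune_dir` removes a list of binary files from the unpacked source tree. A sketch of that contract, returning the entries that could not be removed (the return value is an assumption):

```python
import pathlib
import tempfile

def prune_dir(unpack_root, prune_files):
    """Delete each listed file under unpack_root; return the relative
    paths that were missing (illustrative sketch of prune_dir's role)."""
    unpack_root = pathlib.Path(unpack_root)
    unremovable = []
    for relative in prune_files:
        target = unpack_root / relative
        if target.is_file():
            target.unlink()
        else:
            unremovable.append(relative)
    return unremovable

root = pathlib.Path(tempfile.mkdtemp())
(root / 'bin').mkdir()
(root / 'bin' / 'blob.o').write_bytes(b'\x00')
missing = prune_dir(root, ['bin/blob.o', 'not-there.bin'])
```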

FILE: utils/tests/test_domain_substitution.py
  function test_update_timestamp (line 14) | def test_update_timestamp():

FILE: utils/tests/test_patches.py
  function test_find_and_check_patch (line 16) | def test_find_and_check_patch():
  function test_patch_from_which (line 27) | def test_patch_from_which():
  function test_patch_from_env (line 32) | def test_patch_from_env():

FILE: utils/third_party/schema.py
  class SchemaError (line 17) | class SchemaError(Exception):
    method __init__ (line 20) | def __init__(self, autos, errors=None):
    method code (line 26) | def code(self):
  class SchemaWrongKeyError (line 46) | class SchemaWrongKeyError(SchemaError):
  class SchemaMissingKeyError (line 52) | class SchemaMissingKeyError(SchemaError):
  class SchemaForbiddenKeyError (line 58) | class SchemaForbiddenKeyError(SchemaError):
  class SchemaUnexpectedTypeError (line 64) | class SchemaUnexpectedTypeError(SchemaError):
  class And (line 70) | class And(object):
    method __init__ (line 74) | def __init__(self, *args, **kw):
    method __repr__ (line 82) | def __repr__(self):
    method validate (line 86) | def validate(self, data):
  class Or (line 100) | class Or(And):
    method validate (line 103) | def validate(self, data):
  class Regex (line 123) | class Regex(object):
    method __init__ (line 131) | def __init__(self, pattern_str, flags=0, error=None):
    method __repr__ (line 144) | def __repr__(self):
    method validate (line 149) | def validate(self, data):
  class Use (line 166) | class Use(object):
    method __init__ (line 171) | def __init__(self, callable_, error=None):
    method __repr__ (line 176) | def __repr__(self):
    method validate (line 179) | def validate(self, data):
  function _priority (line 196) | def _priority(s):
  class Schema (line 212) | class Schema(object):
    method __init__ (line 217) | def __init__(self, schema, error=None, ignore_extra_keys=False):
    method __repr__ (line 222) | def __repr__(self):
    method _dict_key_priority (line 226) | def _dict_key_priority(s):
    method validate (line 234) | def validate(self, data):
  class Optional (line 342) | class Optional(Schema):
    method __init__ (line 346) | def __init__(self, *args, **kwargs):
    method __hash__ (line 359) | def __hash__(self):
    method __eq__ (line 362) | def __eq__(self, other):
  class Forbidden (line 369) | class Forbidden(Schema):
    method __init__ (line 370) | def __init__(self, *args, **kwargs):
  class Const (line 375) | class Const(Schema):
    method validate (line 376) | def validate(self, data):
  function _callable_str (line 381) | def _callable_str(callable_):
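
The vendored `schema` library validates data by dispatching on the schema object's kind (dict, type, callable, or literal), as the `_priority` helper hints. A drastically reduced sketch of that dispatch, nothing like the full class hierarchy above:

```python
def validate(schema, data):
    """Tiny dispatcher in the spirit of Schema.validate: dicts recurse,
    types check isinstance, callables must accept the value, anything
    else must compare equal (illustrative sketch)."""
    if isinstance(schema, dict):
        return {key: validate(value, data[key])
                for key, value in schema.items()}
    if isinstance(schema, type):
        if not isinstance(data, schema):
            raise ValueError(f'{data!r} is not {schema.__name__}')
        return data
    if callable(schema):
        if not schema(data):
            raise ValueError(f'{schema!r} rejected {data!r}')
        return data
    if data != schema:
        raise ValueError(f'{data!r} != {schema!r}')
    return data

config = validate({'version': str, 'jobs': int},
                  {'version': '83.0', 'jobs': 4})
```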
Condensed preview — 162 files, each entry showing path, character count, and a content snippet.
[
  {
    "path": ".cirrus.yml",
    "chars": 1807,
    "preview": "container:\n    dockerfile: .cirrus_Dockerfile\n\ncode_check_task:\n    pip_cache:\n        folder: ~/.cache/pip\n        fing"
  },
  {
    "path": ".cirrus_Dockerfile",
    "chars": 139,
    "preview": "# Dockerfile for Python 3 with xz-utils (for tar.xz unpacking)\n\nFROM python:3.6-slim\n\nRUN apt update && apt install -y x"
  },
  {
    "path": ".cirrus_requirements.txt",
    "chars": 157,
    "preview": "# Based on Python package versions in Debian buster\nastroid==2.1.0 # via pylint\npylint==2.2.2\npytest-cov==2.6.0\npytest=="
  },
  {
    "path": ".github/ISSUE_TEMPLATE/bugreport.md",
    "chars": 673,
    "preview": "---\nname: Bug report\nabout: Report a bug building or running ungoogled-chromium\ntitle: ''\nlabels: ''\nassignees: ''\n\n---\n"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/create-an--updating-to-chromium-x-x-x-x-.md",
    "chars": 1059,
    "preview": "---\nname: Create an \"Updating to Chromium x.x.x.x\"\nabout: For letting the community track progress to a new stable Chrom"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/feature_request.md",
    "chars": 578,
    "preview": "---\nname: Feature request\nabout: Suggest an idea\ntitle: ''\nlabels: ''\nassignees: ''\n\n---\n\n**Is your feature request rela"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/other.md",
    "chars": 159,
    "preview": "---\nname: Other\nabout: Anything else not listed\ntitle: ''\nlabels: ''\nassignees: ''\n\n---\n\n*IMPORTANT: Please ensure you h"
  },
  {
    "path": ".github/PULL_REQUEST_TEMPLATE.md",
    "chars": 103,
    "preview": "*(Please ensure you have read SUPPORT.md and docs/contributing.md before submitting the Pull Request)*\n"
  },
  {
    "path": ".gitignore",
    "chars": 175,
    "preview": "# Python files\n__pycache__/\n*.py[cod]\n\n# Python testing files\n.coverage\n\n# Ignore macOS Finder meta\n.DS_Store\n.tm_proper"
  },
  {
    "path": ".style.yapf",
    "chars": 182,
    "preview": "[style]\nbased_on_style = pep8\nallow_split_before_dict_value = false\ncoalesce_brackets = true\ncolumn_limit = 100\nindent_w"
  },
  {
    "path": "LICENSE",
    "chars": 1543,
    "preview": "BSD 3-Clause License\n\nCopyright (c) 2015-2020, The ungoogled-chromium Authors\nAll rights reserved.\n\nRedistribution and u"
  },
  {
    "path": "README.md",
    "chars": 10956,
    "preview": "# ungoogled-chromium\n\n*A lightweight approach to removing Google web service dependency*\n\n**Help is welcome!** See the ["
  },
  {
    "path": "SUPPORT.md",
    "chars": 1569,
    "preview": "# Support\n\n**Before you submit feedback, please ensure you have tried the following**: \n\n* Read the [FAQ](//ungoogled-so"
  },
  {
    "path": "chromium_version.txt",
    "chars": 14,
    "preview": "83.0.4103.116\n"
  },
  {
    "path": "devutils/.coveragerc",
    "chars": 498,
    "preview": "[run]\nbranch = True\nparallel = True\nomit = tests/*\n\n[report]\n# Regexes for lines to exclude from consideration\nexclude_l"
  },
  {
    "path": "devutils/README.md",
    "chars": 193,
    "preview": "# Developer utilities for ungoogled-chromium\n\nThis is a collection of scripts written for developing on ungoogled-chromi"
  },
  {
    "path": "devutils/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "devutils/check_all_code.sh",
    "chars": 605,
    "preview": "#!/bin/bash\n\n# Wrapper for devutils and utils formatter, linter, and tester\n\nset -eu\n\n_root_dir=$(dirname $(dirname $(re"
  },
  {
    "path": "devutils/check_downloads_ini.py",
    "chars": 1625,
    "preview": "#!/usr/bin/env python3\n# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "devutils/check_files_exist.py",
    "chars": 1152,
    "preview": "#!/usr/bin/env python3\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source c"
  },
  {
    "path": "devutils/check_gn_flags.py",
    "chars": 2196,
    "preview": "#!/usr/bin/env python3\n# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "devutils/check_patch_files.py",
    "chars": 4385,
    "preview": "#!/usr/bin/env python3\n# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "devutils/print_tag_version.sh",
    "chars": 135,
    "preview": "_root_dir=$(dirname $(dirname $(readlink -f $0)))\nprintf '%s-%s' $(cat $_root_dir/chromium_version.txt) $(cat $_root_dir"
  },
  {
    "path": "devutils/pytest.ini",
    "chars": 306,
    "preview": "[pytest]\ntestpaths = tests\n#filterwarnings =\n#\terror\n#\tignore::DeprecationWarning\n#addopts = --cov-report term-missing -"
  },
  {
    "path": "devutils/run_devutils_pylint.py",
    "chars": 1638,
    "preview": "#!/usr/bin/env python3\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source c"
  },
  {
    "path": "devutils/run_devutils_tests.sh",
    "chars": 151,
    "preview": "#!/bin/bash\n\nset -eu\n\n_root_dir=$(dirname $(dirname $(readlink -f $0)))\ncd ${_root_dir}/devutils\npython3 -m pytest -c ${"
  },
  {
    "path": "devutils/run_devutils_yapf.sh",
    "chars": 190,
    "preview": "#!/bin/bash\n\nset -eu\n\n_current_dir=$(dirname $(readlink -f $0))\n_root_dir=$(dirname $_current_dir)\npython3 -m yapf --sty"
  },
  {
    "path": "devutils/run_other_pylint.py",
    "chars": 2969,
    "preview": "#!/usr/bin/env python3\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source c"
  },
  {
    "path": "devutils/run_other_yapf.sh",
    "chars": 97,
    "preview": "#!/bin/bash\n\nset -eu\n\npython3 -m yapf --style \"$(dirname $(readlink -f $0))/.style.yapf\" -rpi $@\n"
  },
  {
    "path": "devutils/run_utils_pylint.py",
    "chars": 1527,
    "preview": "#!/usr/bin/env python3\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source c"
  },
  {
    "path": "devutils/run_utils_tests.sh",
    "chars": 145,
    "preview": "#!/bin/bash\n\nset -eu\n\n_root_dir=$(dirname $(dirname $(readlink -f $0)))\ncd ${_root_dir}/utils\npython3 -m pytest -c ${_ro"
  },
  {
    "path": "devutils/run_utils_yapf.sh",
    "chars": 166,
    "preview": "#!/bin/bash\n\nset -eu\n\n_root_dir=$(dirname $(dirname $(readlink -f $0)))\npython3 -m yapf --style \"$_root_dir/.style.yapf\""
  },
  {
    "path": "devutils/set_quilt_vars.sh",
    "chars": 1626,
    "preview": "# Sets quilt variables for updating the patches\n# Make sure to run this with the shell command \"source\" in order to inhe"
  },
  {
    "path": "devutils/tests/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "devutils/tests/test_check_patch_files.py",
    "chars": 989,
    "preview": "# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2020 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source "
  },
  {
    "path": "devutils/tests/test_validate_patches.py",
    "chars": 2124,
    "preview": "# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2020 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source "
  },
  {
    "path": "devutils/third_party/README.md",
    "chars": 179,
    "preview": "This directory contains third-party libraries used by devutils.\n\nContents:\n\n* [python-unidiff](//github.com/matiasb/pyth"
  },
  {
    "path": "devutils/third_party/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "devutils/third_party/unidiff/__init__.py",
    "chars": 1459,
    "preview": "# -*- coding: utf-8 -*-\n\n# The MIT License (MIT)\n# Copyright (c) 2014-2017 Matias Bordese\n#\n# Permission is hereby grant"
  },
  {
    "path": "devutils/third_party/unidiff/__version__.py",
    "chars": 1170,
    "preview": "# -*- coding: utf-8 -*-\n\n# The MIT License (MIT)\n# Copyright (c) 2014-2017 Matias Bordese\n#\n# Permission is hereby grant"
  },
  {
    "path": "devutils/third_party/unidiff/constants.py",
    "chars": 2202,
    "preview": "# -*- coding: utf-8 -*-\n\n# The MIT License (MIT)\n# Copyright (c) 2014-2017 Matias Bordese\n#\n# Permission is hereby grant"
  },
  {
    "path": "devutils/third_party/unidiff/errors.py",
    "chars": 1335,
    "preview": "# -*- coding: utf-8 -*-\n\n# The MIT License (MIT)\n# Copyright (c) 2014-2017 Matias Bordese\n#\n# Permission is hereby grant"
  },
  {
    "path": "devutils/third_party/unidiff/patch.py",
    "chars": 16415,
    "preview": "# -*- coding: utf-8 -*-\n\n# The MIT License (MIT)\n# Copyright (c) 2014-2017 Matias Bordese\n#\n# Permission is hereby grant"
  },
  {
    "path": "devutils/update_lists.py",
    "chars": 12857,
    "preview": "#!/usr/bin/env python3\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source c"
  },
  {
    "path": "devutils/update_platform_patches.py",
    "chars": 6679,
    "preview": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "devutils/validate_config.py",
    "chars": 1713,
    "preview": "#!/usr/bin/env python3\n# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "devutils/validate_patches.py",
    "chars": 29315,
    "preview": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\n# Copyright (c) 2020 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "docs/building.md",
    "chars": 2800,
    "preview": "# Building ungoogled-chromium\n\nThe recommended way to build ungoogled-chromium is by consulting [the repository for your"
  },
  {
    "path": "docs/contributing.md",
    "chars": 2997,
    "preview": "# Contributing\n\nThis document contains our criteria and guidelines for contributing to ungoogled-chromium.\n\nIf you have "
  },
  {
    "path": "docs/design.md",
    "chars": 7347,
    "preview": "# Design\n\nThis document contains a high-level technical description of ungoogled-chromium and its components.\n\n## Overvi"
  },
  {
    "path": "docs/developing.md",
    "chars": 5410,
    "preview": "# Development notes and procedures\n\nThis document contains an assortment of information for those who want to develop un"
  },
  {
    "path": "docs/flags.md",
    "chars": 3956,
    "preview": "# List of flags and switches\n\nThis is an exhaustive list of command-line switches and `chrome://flags` introduced by ung"
  },
  {
    "path": "docs/platforms.md",
    "chars": 794,
    "preview": "# Supported Platforms\n\nThis page lists platforms officially supported by ungoogled-chromium, and their associated reposi"
  },
  {
    "path": "docs/repo_management.md",
    "chars": 3698,
    "preview": "# Platform Repository Standards and Guidelines\n\n*This document is new, and its structure and content may change. If you "
  },
  {
    "path": "domain_regex.list",
    "chars": 1011,
    "preview": "fonts(\\\\*?)\\.googleapis(\\\\*?)\\.com#f0ntz\\g<1>.9oo91e8p1\\g<2>.qjz9zk\ngoogle([A-Za-z\\-]*?\\\\*?)\\.com(?!mon)#9oo91e\\g<1>.qjz"
  },
  {
    "path": "domain_substitution.list",
    "chars": 740200,
    "preview": ".gn\nBUILD.gn\nPRESUBMIT.py\nPRESUBMIT_test.py\nandroid_webview/browser/aw_browser_context.cc\nandroid_webview/browser/aw_con"
  },
  {
    "path": "downloads.ini",
    "chars": 536,
    "preview": "# Official Chromium source code archive\n# NOTE: Substitutions beginning with underscore are provided by utils\n[chromium]"
  },
  {
    "path": "flags.gn",
    "chars": 617,
    "preview": "clang_use_chrome_plugins=false\nclosure_compile=false\nenable_hangout_services_extension=false\nenable_mdns=false\nenable_ms"
  },
  {
    "path": "patches/core/bromite/disable-fetching-field-trials.patch",
    "chars": 3322,
    "preview": "# NOTE: Modified to remove usage of compiler #if macros\nFrom: csagan5 <32685696+csagan5@users.noreply.github.com>\nDate: "
  },
  {
    "path": "patches/core/debian/disable/unrar.patch",
    "chars": 2954,
    "preview": "description: disable support for safe browsing inspection of rar files\nauthor: Michael Gilbert <mgilbert@debian.org>\nbug"
  },
  {
    "path": "patches/core/inox-patchset/0001-fix-building-without-safebrowsing.patch",
    "chars": 54438,
    "preview": "--- a/chrome/browser/BUILD.gn\n+++ b/chrome/browser/BUILD.gn\n@@ -3180,8 +3180,6 @@ jumbo_static_library(\"browser\") {\n    "
  },
  {
    "path": "patches/core/inox-patchset/0003-disable-autofill-download-manager.patch",
    "chars": 4188,
    "preview": "--- a/components/autofill/core/browser/autofill_download_manager.cc\n+++ b/components/autofill/core/browser/autofill_down"
  },
  {
    "path": "patches/core/inox-patchset/0005-disable-default-extensions.patch",
    "chars": 4208,
    "preview": "--- a/chrome/browser/extensions/component_extensions_whitelist/whitelist.cc\n+++ b/chrome/browser/extensions/component_ex"
  },
  {
    "path": "patches/core/inox-patchset/0007-disable-web-resource-service.patch",
    "chars": 2068,
    "preview": "--- a/components/web_resource/web_resource_service.cc\n+++ b/components/web_resource/web_resource_service.cc\n@@ -120,44 +"
  },
  {
    "path": "patches/core/inox-patchset/0009-disable-google-ipv6-probes.patch",
    "chars": 728,
    "preview": "--- a/net/dns/host_resolver_manager.cc\n+++ b/net/dns/host_resolver_manager.cc\n@@ -130,10 +130,10 @@ const unsigned kMini"
  },
  {
    "path": "patches/core/inox-patchset/0014-disable-translation-lang-fetch.patch",
    "chars": 2570,
    "preview": "--- a/chrome/browser/spellchecker/spellcheck_hunspell_dictionary.cc\n+++ b/chrome/browser/spellchecker/spellcheck_hunspel"
  },
  {
    "path": "patches/core/inox-patchset/0015-disable-update-pings.patch",
    "chars": 334,
    "preview": "--- a/chrome/updater/configurator.cc\n+++ b/chrome/updater/configurator.cc\n@@ -60,7 +60,7 @@ int Configurator::UpdateDela"
  },
  {
    "path": "patches/core/inox-patchset/0017-disable-new-avatar-menu.patch",
    "chars": 434,
    "preview": "--- a/components/signin/internal/identity_manager/primary_account_policy_manager_impl.cc\n+++ b/components/signin/interna"
  },
  {
    "path": "patches/core/inox-patchset/0021-disable-rlz.patch",
    "chars": 425,
    "preview": "# Disable rlz\n\n--- a/rlz/buildflags/buildflags.gni\n+++ b/rlz/buildflags/buildflags.gni\n@@ -6,6 +6,6 @@ import(\"//build/c"
  },
  {
    "path": "patches/core/iridium-browser/all-add-trk-prefixes-to-possibly-evil-connections.patch",
    "chars": 38465,
    "preview": "From cf98027e6f671068371d33e89db26d8bfcd6caff Mon Sep 17 00:00:00 2001\nFrom: Jan Engelhardt <jengelh@inai.de>\nDate: Mon,"
  },
  {
    "path": "patches/core/iridium-browser/safe_browsing-disable-incident-reporting.patch",
    "chars": 3271,
    "preview": "From fe92c640c7e02841dcf5dbc20a5eddbd07fd7edf Mon Sep 17 00:00:00 2001\nFrom: Joachim Bauch <jojo@struktur.de>\nDate: Tue,"
  },
  {
    "path": "patches/core/iridium-browser/safe_browsing-disable-reporting-of-safebrowsing-over.patch",
    "chars": 5244,
    "preview": "From 8f348bf2c249701de2f6049ac57fe346bd6b665f Mon Sep 17 00:00:00 2001\nFrom: Joachim Bauch <jojo@struktur.de>\nDate: Tue,"
  },
  {
    "path": "patches/core/ungoogled-chromium/block-trk-and-subdomains.patch",
    "chars": 9965,
    "preview": "# Block all connection requests with 'qjz9zk' in the domain name or with a 'trk:' scheme.\n# This patch is based on Iridi"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-crash-reporter.patch",
    "chars": 1731,
    "preview": "# Disable some background communication with clients2.google.com\n\n--- a/chrome/browser/tracing/crash_service_uploader.cc"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-domain-reliability.patch",
    "chars": 27672,
    "preview": "# Disable domain reliability component\n\n--- a/components/domain_reliability/BUILD.gn\n+++ b/components/domain_reliability"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-fonts-googleapis-references.patch",
    "chars": 4491,
    "preview": "# Disables references to fonts.googleapis.com\n\n--- a/components/dom_distiller/content/browser/dom_distiller_viewer_sourc"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-gaia.patch",
    "chars": 4426,
    "preview": "# Disables Gaia code\n# Somehow it is still activated even without being signed-in: https://github.com/Eloston/ungoogled-"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-gcm.patch",
    "chars": 2695,
    "preview": "# Disable Google Cloud Messaging (GCM) client\n\n--- a/components/gcm_driver/gcm_client_impl.cc\n+++ b/components/gcm_drive"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-google-host-detection.patch",
    "chars": 22663,
    "preview": "# Disables various detections of Google hosts and functionality specific to them\n\n--- a/chrome/common/google_url_loader_"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-mei-preload.patch",
    "chars": 955,
    "preview": "# Disables use of a binary for preloading the Media Engagement index\n# Said binary is: chrome/browser/resources/media/me"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-network-time-tracker.patch",
    "chars": 693,
    "preview": "# Disable Network Time Tracker\n# This connects to Google to check if the system time is correct when a website certifica"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-profile-avatar-downloading.patch",
    "chars": 724,
    "preview": "# Stop downloading of profile avatar (trk:271:...)\n\n--- a/chrome/browser/profiles/profile_avatar_downloader.cc\n+++ b/chr"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-signin.patch",
    "chars": 1182,
    "preview": "# Disables browser sign-in\n\n--- a/chrome/browser/ui/chrome_pages.cc\n+++ b/chrome/browser/ui/chrome_pages.cc\n@@ -494,22 +"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-translate.patch",
    "chars": 1733,
    "preview": "# Disables browser translation\n\n--- a/components/translate/content/renderer/translate_agent.cc\n+++ b/components/translat"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-untraceable-urls.patch",
    "chars": 4086,
    "preview": "# Disable additional URLs that are not caught by the \"trk\" scheme\n\n--- a/chrome/browser/plugins/plugins_resource_service"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-webrtc-log-uploader.patch",
    "chars": 4608,
    "preview": "# Disables WebRTC log uploading to Google\n\n--- a/chrome/browser/media/webrtc/webrtc_log_uploader.cc\n+++ b/chrome/browser"
  },
  {
    "path": "patches/core/ungoogled-chromium/disable-webstore-urls.patch",
    "chars": 7763,
    "preview": "# Disables Chrome Webstore-related URLs and other internal functionality. Mainly for disabling auto updates via the Chro"
  },
  {
    "path": "patches/core/ungoogled-chromium/fix-building-without-enabling-reporting.patch",
    "chars": 766,
    "preview": "--- a/content/browser/BUILD.gn\n+++ b/content/browser/BUILD.gn\n@@ -1261,6 +1261,8 @@ jumbo_source_set(\"browser\") {\n     \""
  },
  {
    "path": "patches/core/ungoogled-chromium/fix-building-without-one-click-signin.patch",
    "chars": 1923,
    "preview": "# Fix building without one click signin\n\n--- a/chrome/browser/ui/BUILD.gn\n+++ b/chrome/browser/ui/BUILD.gn\n@@ -3622,8 +3"
  },
  {
    "path": "patches/core/ungoogled-chromium/fix-building-without-safebrowsing.patch",
    "chars": 48464,
    "preview": "# Additional changes to Inox's fix-building-without-safebrowsing.patch\n\n--- a/chrome/browser/chrome_content_browser_clie"
  },
  {
    "path": "patches/core/ungoogled-chromium/fix-learn-doubleclick-hsts.patch",
    "chars": 806,
    "preview": "# Split up the learn.doubleclick.net string literal to prevent domain substitution breaking compilation due to the use o"
  },
  {
    "path": "patches/core/ungoogled-chromium/remove-unused-preferences-fields.patch",
    "chars": 236096,
    "preview": "# Remove unused Safe Browsing and Sign-in fields from Preferences file\n# TODO: This patch should probably be split up an"
  },
  {
    "path": "patches/core/ungoogled-chromium/replace-google-search-engine-with-nosearch.patch",
    "chars": 1932,
    "preview": "--- a/components/search_engines/prepopulated_engines.json\n+++ b/components/search_engines/prepopulated_engines.json\n@@ -"
  },
  {
    "path": "patches/core/ungoogled-chromium/use-local-devtools-files.patch",
    "chars": 2742,
    "preview": "# Always use local DevTools files instead of remote files from Google\n# NOTE: This can break Remote Debugging\n# This als"
  },
  {
    "path": "patches/extra/bromite/fingerprinting-flags-client-rects-and-measuretext.patch",
    "chars": 14271,
    "preview": "# Adds two flags:\n# 1. --fingerprinting-client-rects-noise to enable fingerprinting deception for Range::getClientRects "
  },
  {
    "path": "patches/extra/bromite/flag-fingerprinting-canvas-image-data-noise.patch",
    "chars": 13977,
    "preview": "# NOTE: Changes made:\n# * Added flag --fingerprinting-canvas-image-data-noise to enable/disable\n#   Canvas image data fi"
  },
  {
    "path": "patches/extra/bromite/flag-max-connections-per-host.patch",
    "chars": 7372,
    "preview": "From: csagan5 <32685696+csagan5@users.noreply.github.com>\nDate: Sun, 8 Jul 2018 22:42:04 +0200\nSubject: Add flag to conf"
  },
  {
    "path": "patches/extra/debian/disable/android.patch",
    "chars": 506,
    "preview": "description: disable dependency on chrome/android\nauthor: Michael Gilbert <mgilbert@debian.org>\n\n--- a/BUILD.gn\n+++ b/BU"
  },
  {
    "path": "patches/extra/debian/disable/device-notifications.patch",
    "chars": 670,
    "preview": "description: disable device discovery notifications by default\nauthor: Michael Gilbert <mgilbert@debian.org>\nbug-debian:"
  },
  {
    "path": "patches/extra/debian/disable/fuzzers.patch",
    "chars": 1053,
    "preview": "description: fuzzers aren't built, so don't depend on them\nauthor: Michael Gilbert <mgilbert@debian.org>\n\n--- a/BUILD.gn"
  },
  {
    "path": "patches/extra/debian/disable/google-api-warning.patch",
    "chars": 696,
    "preview": "description: disable the google api key warning when those aren't found\nauthor: Michael Gilbert <mgilbert@debian.org>\n\n-"
  },
  {
    "path": "patches/extra/debian/disable/welcome-page.patch",
    "chars": 723,
    "preview": "description: do not override the welcome page setting set in master_preferences\nauthor: Michael Gilbert <mgilbert@debian"
  },
  {
    "path": "patches/extra/debian/fixes/connection-message.patch",
    "chars": 950,
    "preview": "description: suggest proxy misconfiguration when network is unreachable\nauthor: Michael Gilbert <mgilbert@debian.org>\nbu"
  },
  {
    "path": "patches/extra/debian/gn/parallel.patch",
    "chars": 1137,
    "preview": "description: respect specified number of parallel jobs while bootstrapping gn\nauthor: Michael Gilbert <mgilbert@debian.o"
  },
  {
    "path": "patches/extra/debian/warnings/initialization.patch",
    "chars": 548,
    "preview": "description: source_ could be uninitialized\nauthor: Michael Gilbert <mgilbert@debian.org>\n\n--- a/third_party/cacheinvali"
  },
  {
    "path": "patches/extra/inox-patchset/0006-modify-default-prefs.patch",
    "chars": 9652,
    "preview": "\n--- a/chrome/browser/background/background_mode_manager.cc\n+++ b/chrome/browser/background/background_mode_manager.cc\n@"
  },
  {
    "path": "patches/extra/inox-patchset/0008-restore-classic-ntp.patch",
    "chars": 2332,
    "preview": "--- a/chrome/browser/search/search.cc\n+++ b/chrome/browser/search/search.cc\n@@ -183,26 +183,7 @@ struct NewTabURLDetails"
  },
  {
    "path": "patches/extra/inox-patchset/0011-add-duckduckgo-search-engine.patch",
    "chars": 18123,
    "preview": "--- a/components/search_engines/template_url_prepopulate_data.cc\n+++ b/components/search_engines/template_url_prepopulat"
  },
  {
    "path": "patches/extra/inox-patchset/0013-disable-missing-key-warning.patch",
    "chars": 424,
    "preview": "--- a/chrome/browser/ui/startup/google_api_keys_infobar_delegate.cc\n+++ b/chrome/browser/ui/startup/google_api_keys_info"
  },
  {
    "path": "patches/extra/inox-patchset/0016-chromium-sandbox-pie.patch",
    "chars": 299,
    "preview": "--- a/sandbox/linux/BUILD.gn\n+++ b/sandbox/linux/BUILD.gn\n@@ -311,6 +311,12 @@ if (is_linux) {\n       # These files have"
  },
  {
    "path": "patches/extra/inox-patchset/0018-disable-first-run-behaviour.patch",
    "chars": 393,
    "preview": "--- a/chrome/browser/ui/startup/startup_tab_provider.cc\n+++ b/chrome/browser/ui/startup/startup_tab_provider.cc\n@@ -47,7"
  },
  {
    "path": "patches/extra/inox-patchset/0019-disable-battery-status-service.patch",
    "chars": 3520,
    "preview": "--- a/services/device/battery/battery_status_service.cc\n+++ b/services/device/battery/battery_status_service.cc\n@@ -22,1"
  },
  {
    "path": "patches/extra/iridium-browser/Remove-EV-certificates.patch",
    "chars": 1608,
    "preview": "From d32e222a2706cb59f9855b9cf4330f88d1af5435 Mon Sep 17 00:00:00 2001\nFrom: Jan Engelhardt <jengelh@inai.de>\nDate: Thu,"
  },
  {
    "path": "patches/extra/iridium-browser/browser-disable-profile-auto-import-on-first-run.patch",
    "chars": 876,
    "preview": "From 7134d5fd762237ad2d80093b68ccbd1582476640 Mon Sep 17 00:00:00 2001\nFrom: Joachim Bauch <jojo@struktur.de>\nDate: Thu,"
  },
  {
    "path": "patches/extra/iridium-browser/mime_util-force-text-x-suse-ymp-to-be-downloaded.patch",
    "chars": 757,
    "preview": "From d3dcad96b3c2091026c3a81054bb3ce56538a702 Mon Sep 17 00:00:00 2001\nFrom: Jan Engelhardt <jengelh@inai.de>\nDate: Thu,"
  },
  {
    "path": "patches/extra/iridium-browser/net-cert-increase-default-key-length-for-newly-gener.patch",
    "chars": 803,
    "preview": "From 088a50b2fc66418294166b61f31925426b1a9c54 Mon Sep 17 00:00:00 2001\nFrom: Jan Engelhardt <jengelh@inai.de>\nDate: Mon,"
  },
  {
    "path": "patches/extra/iridium-browser/prefs-always-prompt-for-download-directory-by-defaul.patch",
    "chars": 1771,
    "preview": "From 93010fd16c1c9f01a06eab18055bcab54b028cc8 Mon Sep 17 00:00:00 2001\nFrom: Jan Engelhardt <jengelh@inai.de>\nDate: Fri,"
  },
  {
    "path": "patches/extra/iridium-browser/prefs-only-keep-cookies-until-exit.patch",
    "chars": 1037,
    "preview": "From 0839326fb1b7ff7937cee0efa45f5a4ba23c2f78 Mon Sep 17 00:00:00 2001\nFrom: Jan Engelhardt <jengelh@inai.de>\nDate: Sat,"
  },
  {
    "path": "patches/extra/iridium-browser/updater-disable-auto-update.patch",
    "chars": 1247,
    "preview": "From f97af1715c10c5926169ff317ca7c91f1d073af9 Mon Sep 17 00:00:00 2001\nFrom: Jan Engelhardt <jengelh@inai.de>\nDate: Fri,"
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-components-ungoogled.patch",
    "chars": 2355,
    "preview": "# Add ungoogled-chromium-specific code to components/ungoogled/\n\n--- /dev/null\n+++ b/components/ungoogled/BUILD.gn\n@@ -0"
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-flag-for-pdf-plugin-name.patch",
    "chars": 15436,
    "preview": "--- a/chrome/browser/about_flags.cc\n+++ b/chrome/browser/about_flags.cc\n@@ -282,6 +282,12 @@ const FeatureEntry::Choice "
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-flag-for-search-engine-collection.patch",
    "chars": 6655,
    "preview": "# Add flag to disable automatic search engine collection\n\n--- a/chrome/browser/about_flags.cc\n+++ b/chrome/browser/about"
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-flag-to-configure-extension-downloading.patch",
    "chars": 5172,
    "preview": "# Add extension-mime-request-handling chrome://flag to tweak the behavior of\n# extension MIME types\n\n--- a/chrome/browse"
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-flag-to-disable-beforeunload.patch",
    "chars": 934,
    "preview": "# Add --disable-beforeunload to always disable beforeunload JavaScript dialogs\n\n--- a/components/javascript_dialogs/app_"
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-flag-to-force-punycode-hostnames.patch",
    "chars": 1699,
    "preview": "# Add flag to force punycode in hostnames instead of Unicode when displaying Internationalized Domain Names (IDNs) to mi"
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-flag-to-hide-crashed-bubble.patch",
    "chars": 679,
    "preview": "# Add flag --hide-crashed-bubble to hide the bubble box:\n# \"Restore Pages? Chromium didn't shut down correctly.\"\n\n--- a/"
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-flag-to-scroll-tabs.patch",
    "chars": 3217,
    "preview": "--- a/chrome/browser/about_flags.cc\n+++ b/chrome/browser/about_flags.cc\n@@ -272,6 +272,16 @@ const FeatureEntry::Choice "
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-flag-to-show-avatar-button.patch",
    "chars": 2640,
    "preview": "--- a/chrome/browser/about_flags.cc\n+++ b/chrome/browser/about_flags.cc\n@@ -259,6 +259,19 @@ const FeatureEntry::Choice "
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-flag-to-stack-tabs.patch",
    "chars": 1608,
    "preview": "# Add --enable-stacked-tab-strip and --enable-tab-adjust-layout flags to tweak tab strip behavior\n\n--- a/chrome/browser/"
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-ipv6-probing-option.patch",
    "chars": 1392,
    "preview": "# Disables IPv6 probing and adds an option to change the IPv6 probing result\n# TODO: Consider adding a chrome://flag to "
  },
  {
    "path": "patches/extra/ungoogled-chromium/add-suggestions-url-field.patch",
    "chars": 20263,
    "preview": "# Add suggestions URL text field to the search engine editing dialog\n# (chrome://settings/searchEngines).\n\n--- a/chrome/"
  },
  {
    "path": "patches/extra/ungoogled-chromium/default-to-https-scheme.patch",
    "chars": 3789,
    "preview": "# Default to https for (non-standard) URLs without scheme.\n#\n# This patch handles URLs like user:pass@example.com and tr"
  },
  {
    "path": "patches/extra/ungoogled-chromium/disable-download-quarantine.patch",
    "chars": 9280,
    "preview": "# Disables file download quarantining\n\n--- a/components/download/internal/common/base_file.cc\n+++ b/components/download/"
  },
  {
    "path": "patches/extra/ungoogled-chromium/disable-formatting-in-omnibox.patch",
    "chars": 1317,
    "preview": "# Disables omission of URL elements in Omnibox\n\n--- a/components/url_formatter/url_formatter.cc\n+++ b/components/url_for"
  },
  {
    "path": "patches/extra/ungoogled-chromium/disable-intranet-redirect-detector.patch",
    "chars": 704,
    "preview": "# Disables the intranet redirect detector. It generates extra DNS requests and the functionality using this is disabled\n"
  },
  {
    "path": "patches/extra/ungoogled-chromium/disable-webgl-renderer-info.patch",
    "chars": 1331,
    "preview": "# Return blank values for WebGLDebugRendererInfo to remove a potential data\n# leak while preventing potential website br"
  },
  {
    "path": "patches/extra/ungoogled-chromium/enable-checkbox-external-protocol.patch",
    "chars": 615,
    "preview": "# Return \"Always open links of this type in the associated app\" checkbox.\n\n--- a/chrome/browser/ui/browser_ui_prefs.cc\n+"
  },
  {
    "path": "patches/extra/ungoogled-chromium/enable-page-saving-on-more-pages.patch",
    "chars": 2547,
    "preview": "# Add more URL schemes allowed for saving\n\n--- a/chrome/browser/ui/browser_commands.cc\n+++ b/chrome/browser/ui/browser_c"
  },
  {
    "path": "patches/extra/ungoogled-chromium/enable-paste-and-go-new-tab-button.patch",
    "chars": 1398,
    "preview": "--- a/chrome/browser/ui/views/tabs/new_tab_button.cc\n+++ b/chrome/browser/ui/views/tabs/new_tab_button.cc\n@@ -84,10 +84,"
  },
  {
    "path": "patches/extra/ungoogled-chromium/fix-building-without-mdns-and-service-discovery.patch",
    "chars": 2116,
    "preview": "# Fix building with enable_service_discovery=false and enable_mds=false\n\n--- a/chrome/browser/media/router/discovery/mdn"
  },
  {
    "path": "patches/extra/ungoogled-chromium/popups-to-tabs.patch",
    "chars": 547,
    "preview": "# Make popups go to tabs instead\n\n--- a/content/renderer/render_view_impl.cc\n+++ b/content/renderer/render_view_impl.cc\n"
  },
  {
    "path": "patches/extra/ungoogled-chromium/remove-disable-setuid-sandbox-as-bad-flag.patch",
    "chars": 575,
    "preview": "# Remove the \"--disable-setuid-sandbox\" command line flag as a bad flag\n\n--- a/chrome/browser/ui/startup/bad_flags_promp"
  },
  {
    "path": "patches/extra/ungoogled-chromium/searx.patch",
    "chars": 21151,
    "preview": "# Add searx\n# This will become the default engine if no other engine is already set\n\n--- a/components/search_engines/pre"
  },
  {
    "path": "patches/series",
    "chars": 4929,
    "preview": "core/inox-patchset/0001-fix-building-without-safebrowsing.patch\ncore/inox-patchset/0003-disable-autofill-download-manage"
  },
  {
    "path": "pruning.list",
    "chars": 1948681,
    "preview": "android_webview/test/shell/assets/star.svgz\nandroid_webview/test/shell/assets/video.3gp\nandroid_webview/test/shell/asset"
  },
  {
    "path": "revision.txt",
    "chars": 2,
    "preview": "1\n"
  },
  {
    "path": "utils/.coveragerc",
    "chars": 498,
    "preview": "[run]\nbranch = True\nparallel = True\nomit = tests/*\n\n[report]\n# Regexes for lines to exclude from consideration\nexclude_l"
  },
  {
    "path": "utils/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "utils/_common.py",
    "chars": 4817,
    "preview": "# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2020 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source "
  },
  {
    "path": "utils/_extraction.py",
    "chars": 14167,
    "preview": "# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source "
  },
  {
    "path": "utils/domain_substitution.py",
    "chars": 14419,
    "preview": "#!/usr/bin/env python3\n# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "utils/downloads.py",
    "chars": 18265,
    "preview": "#!/usr/bin/env python3\n# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "utils/filescfg.py",
    "chars": 7153,
    "preview": "#!/usr/bin/env python3\n# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "utils/patches.py",
    "chars": 9526,
    "preview": "#!/usr/bin/env python3\n# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2020 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "utils/prune_binaries.py",
    "chars": 2060,
    "preview": "#!/usr/bin/env python3\n# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved"
  },
  {
    "path": "utils/pytest.ini",
    "chars": 247,
    "preview": "[pytest]\ntestpaths = tests\n#filterwarnings =\n#\terror\n#\tignore::DeprecationWarning\n#addopts = --cov-report term-missing -"
  },
  {
    "path": "utils/tests/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "utils/tests/test_domain_substitution.py",
    "chars": 1230,
    "preview": "# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2019 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source "
  },
  {
    "path": "utils/tests/test_patches.py",
    "chars": 1155,
    "preview": "# -*- coding: UTF-8 -*-\n\n# Copyright (c) 2020 The ungoogled-chromium Authors. All rights reserved.\n# Use of this source "
  },
  {
    "path": "utils/third_party/README.md",
    "chars": 183,
    "preview": "This directory contains third-party libraries used by build utilities.\n\nContents:\n\n* [schema](//github.com/keleshev/sche"
  },
  {
    "path": "utils/third_party/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "utils/third_party/schema.py",
    "chars": 14255,
    "preview": "\"\"\"schema is a library for validating Python data structures, such as those\nobtained from config-files, forms, external "
  }
]

About this extraction

This page contains the full source code of the BoldBrowser/bold-browser GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 162 files (3.5 MB), approximately 915.9k tokens, and a symbol index with 245 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.
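Each file in the extraction is described by a JSON object with a `path`, a `chars` count, and a truncated `preview` string, as in the manifest above. A minimal sketch of consuming that manifest programmatically, e.g. to find which files dominate the 3.5 MB total (the inline sample data and the `largest_files` helper are illustrative, not part of GitExtract's output):

```python
import json

# Hypothetical three-entry sample mirroring the manifest format shown above;
# the real extraction lists all 162 files.
MANIFEST_JSON = """
[
  {"path": "pruning.list", "chars": 1948681, "preview": "android_webview/test/..."},
  {"path": "patches/series", "chars": 4929, "preview": "core/inox-patchset/..."},
  {"path": "revision.txt", "chars": 2, "preview": "1"}
]
"""


def largest_files(entries, n=3):
    """Return (path, chars) pairs for the n largest files in the manifest."""
    ranked = sorted(entries, key=lambda e: e["chars"], reverse=True)
    return [(e["path"], e["chars"]) for e in ranked[:n]]


entries = json.loads(MANIFEST_JSON)
for path, chars in largest_files(entries):
    print(f"{chars:>9}  {path}")
```

Sorting by `chars` before feeding files to an LLM is useful for staying under a context-window budget: in this repository a single list file (`pruning.list`, ~1.9 MB) accounts for most of the total size.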

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.
