Full Code of dortania/build-repo for AI

Repository: dortania/build-repo
Branch: github-actions
Commit: 74c2f09db21f
Files: 19
Total size: 58.3 KB

Directory structure:
build-repo/

├── .dccache
├── .flake8
├── .github/
│   └── workflows/
│       └── workflow.yaml
├── .gitignore
├── .pylintrc
├── README.md
├── add.py
├── builder.py
├── check_ratelimit.py
├── config_mgmt.py
├── downloader.py
├── local-test.sh
├── notify.py
├── parallel_check.py
├── plugins.json
├── requirements.txt
├── sort_plugins.py
├── update_config.py
└── updater.py

================================================
FILE CONTENTS
================================================

================================================
FILE: .dccache
================================================
[{"/Users/dhinak/Documents/GitHub/build-repo/.pylintrc":"1","/Users/dhinak/Documents/GitHub/build-repo/add.py":"2","/Users/dhinak/Documents/GitHub/build-repo/builder.py":"3","/Users/dhinak/Documents/GitHub/build-repo/check_ratelimit.py":"4","/Users/dhinak/Documents/GitHub/build-repo/downloader.py":"5","/Users/dhinak/Documents/GitHub/build-repo/import_old.py":"6","/Users/dhinak/Documents/GitHub/build-repo/parallel_check.py":"7","/Users/dhinak/Documents/GitHub/build-repo/sort_plugins.py":"8","/Users/dhinak/Documents/GitHub/build-repo/test_release.py":"9","/Users/dhinak/Documents/GitHub/build-repo/update_config.py":"10","/Users/dhinak/Documents/GitHub/build-repo/updater.py":"11"},[359,1609043345880.1628,"12"],[6382,1609043323409.1133,"13"],[12077,1609043323410.3127,"14"],[217,1609043323410.6653,"15"],[2751,1609043323411.064,"16"],[6461,1609043323411.679,"17"],[1378,1609043323412.0698,"18"],[213,1609043323412.326,"19"],[1799,1609043323412.6558,"20"],[5805,1609043323413.2163,"21"],[6626,1609043323413.8252,"22"],"3317598182cbaacada866694f5ad412226670324676a10d4bdba37ba91d7ab1b","2b6989899fa7539a43eeaf2785f6fe515ae8cc6ad8c372c5a047c49349d8d272","4057a856d5f40a267ef5323de9ceb7027e2f6ca3dacfb7bbe6926885d2044061","edd765034a7f1491508d2062beb6246d73f34f5e51fa5bfa2c9797406f119bfb","505a3205046744882bf07457d8f7aded0d6b2920c5dff7257a5ba9267b89ba18","8d8980ea29ddae82216bafcc0f6a219211a34fd44b3e1a5031809f9c8668b99e","a204565775c3bb211df6792461cef60ac7156cdb368069477efca57696f49346","8ce81cbb63bf426bd2898a084f470805a2ed8e7f6441242900e3432f44bac717","88d36be415255600d3441e6296e11d0c8a2f654fd67aa05196a4b7b6b1b07e62","8c2148e850e5134996a936325d1420540983d24f3cef72d726485d4c07f7647c","e66576fc1a1ff4297d34c401c17e51c4dca9caca3192af4ff0baf001be7801f4"]

================================================
FILE: .flake8
================================================
[flake8]
extend-ignore = E501, E203

================================================
FILE: .github/workflows/workflow.yaml
================================================
name: Build
on:
  push:
  schedule:
    - cron: '*/5 * * * *'
  workflow_dispatch:
env:
  FORCE_INSTALL: 1
  HAS_OPENSSL_BUILD: 1
  HAS_OPENSSL_W32BUILD: 0
  ACID32: 1
  HOMEBREW_NO_INSTALL_CLEANUP: 1
  HOMEBREW_NO_AUTO_UPDATE: 1
  PROD: ${{ github.ref == 'refs/heads/github-actions' }}
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
jobs:
  build:
    runs-on: m1_monterey
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4
        # with:
        #   ref: github-actions
      - name: Set up Python 3
        run: brew install python3 python-tk
#       - uses: actions/setup-python@v4
#         with:
#           python-version: '3.10'
#           cache: pip
      - name: Install Python Dependencies
        run: |
          python3 -m pip install -U pip wheel
          python3 -m pip install hammock python-dateutil datetime termcolor2 purl python-magic humanize gitpython cryptography macholib
          echo "OVERRIDE_PYTHON3=$(which python3)" >> "$GITHUB_ENV"
      # - name: Check Parallel
      #   run: python3 -u parallel_check.py ${{ secrets.GITHUB_TOKEN }}
      - name: Install Build Dependencies
        run: | # Needed for VoodooI2C to build without actually having cldoc & cpplint
          brew tap FiloSottile/homebrew-musl-cross
          brew install libmagic mingw-w64 openssl musl-cross
          mkdir wrappers
          printf "#!/bin/bash\nexit 0" > wrappers/cldoc
          printf "#!/bin/bash\nexit 0" > wrappers/cpplint
          chmod +x wrappers/cldoc wrappers/cpplint
          echo "$(readlink -f wrappers)" >> "$GITHUB_PATH"
      - uses: fregante/setup-git-user@2e28d51939d2a84005a917d2f844090637f435f8
      - name: Set Up Working Tree
        uses: actions/checkout@v4
        with:
          ref: builds
          path: Config
      - name: Check Ratelimit
        run: python3 -u check_ratelimit.py ${{ secrets.GITHUB_TOKEN }}
      - name: Run Builder
        run: python3 -u updater.py ${{ secrets.GITHUB_TOKEN }} ${{ secrets.WEBHOOK_URL }} ${{ secrets.PAYLOAD_KEY }}
        env:
          JOB_NAME: ${{ github.job }}
      - name: Check Ratelimit
        run: python3 -u check_ratelimit.py ${{ secrets.GITHUB_TOKEN }}
      - name: Upload Artifact
        uses: actions/upload-artifact@v4
        if: ${{ env.PROD == 'false' }}
        with:
          name: Build
          path: Config
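
# Note: the `concurrency` block above only groups runs by workflow and ref; it does not
# cancel superseded runs. If cancellation were the intent, GitHub Actions supports an
# additional key (a sketch, not present in the source workflow):

```yaml
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true  # cancel an in-flight run when a newer one starts
```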


================================================
FILE: .gitignore
================================================
gh token.txt
Lilu-and-Friends/
__pycache__/
Builds/
Config/
Temp/
.vscode/
.DS_Store

================================================
FILE: .pylintrc
================================================
[MASTER]

init-hook="from pylint.config import find_pylintrc; import os, sys; sys.path.append(os.path.dirname(find_pylintrc()))"

[MESSAGES CONTROL]

disable=unused-import,
        subprocess-run-check,
        line-too-long,
        too-few-public-methods,
        missing-module-docstring,
        missing-class-docstring,
        missing-function-docstring

================================================
FILE: README.md
================================================
# build-repo

![Build](https://github.com/dortania/build-repo/workflows/Build/badge.svg)

Credit to CorpNewt for Lilu & Friends, from which some functions originate, and for the inspiration behind this project.

================================================
FILE: add.py
================================================
import datetime
import hashlib
import json
import os
import time
from pathlib import Path

import dateutil.parser
import git
import magic
import purl
from hammock import Hammock as hammock

from config_mgmt import save_config

mime = magic.Magic(mime=True)


def hash_file(file_path: Path):
    return hashlib.sha256(file_path.read_bytes()).hexdigest()


def expand_globs(str_path: str):
    path = Path(str_path)
    parts = path.parts[1:] if path.is_absolute() else path.parts
    return list(Path(path.root).glob(str(Path("").joinpath(*parts))))


def upload_release_asset(release_id, token, file_path: Path):
    upload_url = hammock("https://api.github.com/repos/dortania/build-repo/releases/" + str(release_id), auth=("github-actions", token)).GET().json()
    try:
        upload_url = upload_url["upload_url"]
    except Exception:
        print(upload_url)
        raise
    mime_type = mime.from_file(str(file_path.resolve()))
    if not mime_type:
        print("Failed to guess mime type!")
        raise RuntimeError

    asset_upload = hammock(str(purl.Template(upload_url).expand({"name": file_path.name, "label": file_path.name})), auth=("github-actions", token)).POST(
        data=file_path.read_bytes(),
        headers={"content-type": mime_type}
    )
    return asset_upload.json()["browser_download_url"]


def paginate(url, token):
    url = hammock(url, auth=("github-actions", token)).GET()
    if url.links == {}:
        return url.json()
    else:
        container = url.json()
        while url.links.get("next"):
            url = hammock(url.links["next"]["url"], auth=("github-actions", token)).GET()
            container += url.json()
        return container


def add_built(plugin, token):
    plugin_info = plugin["plugin"]
    commit_info = plugin["commit"]
    files = plugin["files"]

    script_dir = Path(__file__).parent.absolute()
    config_path = script_dir / Path("Config/config.json")
    config_path.touch()
    config = json.load(config_path.open())

    name = plugin_info["Name"]
    plugin_type = plugin_info.get("Type", "Kext")

    ind = None

    if not config.get(name, None):
        config[name] = {}
    if not config[name].get("type", None):
        config[name]["type"] = plugin_type
    if not config[name].get("versions", None):
        config[name]["versions"] = []

    release = {}
    if config[name]["versions"]:
        config[name]["versions"] = [i for i in config[name]["versions"] if not (i.get("commit", {}).get("sha", None) == commit_info["sha"])]

    release["commit"] = {"sha": commit_info["sha"], "message": commit_info["commit"]["message"], "url": commit_info["html_url"], "tree_url": commit_info["html_url"].replace("/commit/", "/tree/")}
    release["version"] = files["version"]
    release["date_built"] = datetime.datetime.now(tz=datetime.timezone.utc).isoformat()
    release["date_committed"] = dateutil.parser.parse(commit_info["commit"]["committer"]["date"]).isoformat()
    release["date_authored"] = dateutil.parser.parse(commit_info["commit"]["author"]["date"]).isoformat()
    release["source"] = "built"

    if os.environ.get("PROD", "false") == "true":
        releases_url = hammock("https://api.github.com/repos/dortania/build-repo/releases", auth=("github-actions", token))

        # Delete previous releases
        for i in paginate("https://api.github.com/repos/dortania/build-repo/releases", token):
            if i["name"] == (name + " " + release["commit"]["sha"][:7]):
                print("\tDeleting previous release...")
                releases_url(i["id"]).DELETE()
                time.sleep(3)  # Prevent race conditions

        # Delete tags
        check_tag = hammock("https://api.github.com/repos/dortania/build-repo/git/refs/tags/" + name + "-" + release["commit"]["sha"][:7], auth=("github-actions", token))
        if check_tag.GET().status_code != 404:
            print("\tDeleting previous tag...")
            check_tag.DELETE()
            time.sleep(3)  # Prevent race conditions

        # Create release
        create_release = releases_url.POST(json={
            "tag_name": name + "-" + release["commit"]["sha"][:7],
            "target_commitish": "builds",
            "name": name + " " + release["commit"]["sha"][:7]
        })
        # print(create_release.json()["id"])
        release["release"] = {"id": create_release.json()["id"], "url": create_release.json()["html_url"]}

    if not release.get("hashes", None):
        release["hashes"] = {"debug": {"sha256": ""}, "release": {"sha256": ""}}

    release["hashes"]["debug"] = {"sha256": hash_file(files["debug"])}
    release["hashes"]["release"] = {"sha256": hash_file(files["release"])}

    if files["extras"]:
        for file in files["extras"]:
            release["hashes"][file.name] = {"sha256": hash_file(file)}

    if os.environ.get("PROD", "false") == "true":
        if not release.get("links", None):
            release["links"] = {}

        for i in ["debug", "release"]:
            release["links"][i] = upload_release_asset(release["release"]["id"], token, files[i])

        if files["extras"]:
            if not release.get("extras", None):
                release["extras"] = {}
            for file in files["extras"]:
                release["extras"][file.name] = upload_release_asset(release["release"]["id"], token, file)
        new_line = "\n"  # No escapes in f-strings

        release["release"]["description"] = f"""**Changes:**
{release['commit']['message'].strip()}
[View on GitHub]({release['commit']['url']}) ([browse tree]({release['commit']['tree_url']}))

**Hashes**:
**Debug:**
{files["debug"].name + ': ' + release['hashes']['debug']["sha256"]}
**Release:**
{files["release"].name + ': ' + release['hashes']['release']["sha256"]}
{'**Extras:**' if files["extras"] else ''}
{new_line.join([(file.name + ': ' + release['hashes'][file.name]['sha256']) for file in files["extras"]]) if files["extras"] else ''}
""".strip()

        hammock("https://api.github.com/repos/dortania/build-repo/releases/" + str(release["release"]["id"]), auth=("github-actions", token)).POST(json={
            "body": release["release"]["description"]
        })

    config[name]["versions"].insert(0, release)
    config[name]["versions"].sort(key=lambda x: (x["date_committed"], x["date_authored"]), reverse=True)
    save_config(config)

    if os.environ.get("PROD", "false") == "true":
        repo = git.Repo(script_dir / Path("Config"))
        repo.git.add(all=True)
        repo.git.commit(message="Deploying to builds")
        repo.git.push()

    return release
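
# The `expand_globs` helper above strips the root from an absolute path before globbing,
# because `Path.glob` rejects absolute patterns. A minimal standalone sketch of the same
# technique (the helper name mirrors the source; the temp files are illustrative):

```python
import tempfile
from pathlib import Path


def expand_globs(str_path: str):
    # Path.glob() raises on absolute patterns, so drop the root ("/")
    # and glob relative to it instead -- the same approach as add.py.
    path = Path(str_path)
    parts = path.parts[1:] if path.is_absolute() else path.parts
    return list(Path(path.root).glob(str(Path("").joinpath(*parts))))


with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / "Lilu.kext").touch()
    (Path(tmp) / "notes.txt").touch()
    matches = expand_globs(f"{tmp}/*.kext")
    names = [p.name for p in matches]
```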


================================================
FILE: builder.py
================================================
import io
import plistlib
import shutil
import stat
import subprocess
import zipfile
from os import chdir
from pathlib import Path

from hammock import Hammock as hammock


class Builder:
    def __init__(self):
        self.lilu = {}
        self.clang32 = None
        self.edk2 = None
        self.script_dir = Path(__file__).parent.absolute()

        self.working_dir = self.script_dir / Path("Temp")
        if self.working_dir.exists():
            shutil.rmtree(self.working_dir)
        self.working_dir.mkdir()

        self.build_dir = self.script_dir / Path("Builds")
        if self.build_dir.exists():
            shutil.rmtree(self.build_dir)
        self.build_dir.mkdir()

    @staticmethod
    def _expand_globs(p: str):
        if "*" in p:
            path = Path(p)
            parts = path.parts[1:] if path.is_absolute() else path.parts
            return list(Path(path.root).glob(str(Path("").joinpath(*parts))))
        else:
            return [Path(p)]

    def _bootstrap_clang32(self, target_dir: Path):
        chdir(self.working_dir)
        clang_dir = self.working_dir / Path("clang32")

        if not self.clang32:
            print("Bootstrapping prerequisite: clang32...")
            if clang_dir.exists():
                shutil.rmtree(clang_dir)
            clang_dir.mkdir()
            chdir(clang_dir)
            print("\tDownloading clang32 binary...")
            zipfile.ZipFile(io.BytesIO(hammock("https://github.com/acidanthera/ocbuild/releases/download/llvm-kext32-latest/clang-12.zip").GET().content)).extractall()
            (clang_dir / Path("clang-12")).chmod((clang_dir / Path("clang-12")).stat().st_mode | stat.S_IEXEC)

            print("\tDownloading clang32 scripts...")
            for tool in ["fix-macho32", "libtool32"]:
                tool_path = Path(tool)
                tool_path.write_bytes(hammock(f"https://raw.githubusercontent.com/acidanthera/ocbuild/master/scripts/{tool}").GET().content)
                tool_path.chmod(tool_path.stat().st_mode | stat.S_IEXEC)
            self.clang32 = clang_dir.resolve()
        (target_dir / Path("clang32")).symlink_to(self.clang32)

    def _bootstrap_edk2(self):
        chdir(self.working_dir)
        if not self.edk2:
            print("Bootstrapping prerequisite: EDK II...")
            if Path("edk2").exists():
                shutil.rmtree(Path("edk2"))
            print("\tCloning the repo...")
            result = subprocess.run("git clone https://github.com/acidanthera/audk edk2 --branch master --depth 1".split(), stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tClone failed!")
                print(result.stdout.decode())
                return False
            self.edk2 = True

    def _build_lilu(self):
        chdir(self.working_dir)
        if not self.lilu:
            print("Building prerequisite: Lilu...")
            if Path("Lilu").exists():
                shutil.rmtree(Path("Lilu"))
            print("\tCloning the repo...")
            result = subprocess.run("git clone https://github.com/acidanthera/Lilu.git".split(), stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tClone failed!")
                print(result.stdout.decode())
                return False
            chdir(self.working_dir / Path("Lilu"))
            print("\tCloning MacKernelSDK...")
            result = subprocess.run("git clone https://github.com/acidanthera/MacKernelSDK.git".split(), stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tClone of MacKernelSDK failed!")
                print(result.stdout.decode())
                return False
            self._bootstrap_clang32(self.working_dir / Path("Lilu"))
            chdir(self.working_dir / Path("Lilu"))
            print("\tBuilding debug version...")
            result = subprocess.run("xcodebuild -quiet -configuration Debug".split(), stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tBuild failed!")
                print(result.stdout.decode())
                return False
            result = subprocess.run("git rev-parse HEAD".split(), stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tObtaining commit hash failed!")
                print(result.stdout.decode())
                return False
            else:
                commithash = result.stdout.decode().strip()
            shutil.copytree(Path("build/Debug/Lilu.kext"), self.working_dir / Path("Lilu.kext"))
            self.lilu = [commithash, self.working_dir / Path("Lilu.kext")]
        return self.lilu[1]

    def build(self, plugin, commithash=None):
        name = plugin["Name"]
        url = plugin["URL"]
        needs_lilu = plugin.get("Lilu", False)
        needs_mackernelsdk = plugin.get("MacKernelSDK", False)
        fat = plugin.get("32-bit", False)
        edk2 = plugin.get("EDK II", False)
        command = plugin.get("Command")
        prebuild = plugin.get("Pre-Build", [])
        postbuild = plugin.get("Post-Build", [])
        build_opts = plugin.get("Build Opts", [])
        build_dir = plugin.get("Build Dir", "build/")
        p_info = plugin.get("Info", f"{build_dir}Release/{name}.kext/Contents/Info.plist")
        b_type = plugin.get("Type", "Kext")
        d_file = plugin.get("Debug File", f"{build_dir}Debug/*.kext")
        r_file = plugin.get("Release File", f"{build_dir}Release/*.kext")
        extra_files = plugin.get("Extras", None)
        v_cmd = plugin.get("Version", None)

        chdir(self.working_dir)

        if needs_lilu:
            if not self._build_lilu():
                print("Building of prerequisite: Lilu failed!")
                return False

        chdir(self.working_dir)
        print("Building " + name + "...")
        if Path(name).exists():
            shutil.rmtree(Path(name))
        print("\tCloning the repo...")
        result = subprocess.run(["git", "clone", "--recurse-submodules", url + ".git", name], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
        if result.returncode != 0:
            print("\tClone failed!")
            print(result.stdout.decode())
            return False
        chdir(self.working_dir / Path(name))

        if commithash:
            print("\tChecking out to " + commithash + "...")
            result = subprocess.run(["git", "checkout", commithash], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tCheckout failed!")
                print(result.stdout.decode())
                return False
        else:
            result = subprocess.run("git rev-parse HEAD".split(), stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tObtaining commit hash failed!")
                print(result.stdout.decode())
                return False
            else:
                commithash = result.stdout.decode().strip()
        chdir(self.working_dir / Path(name))

        if needs_lilu:
            lilu_path = self._build_lilu()
            if not lilu_path:
                print("Building of prerequisite: Lilu failed!")
                return False
            shutil.copytree(lilu_path, self.working_dir / Path(name) / Path("Lilu.kext"))

        chdir(self.working_dir / Path(name))
        if needs_mackernelsdk:
            print("\tCloning MacKernelSDK...")
            result = subprocess.run("git clone https://github.com/acidanthera/MacKernelSDK.git".split(), stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tClone of MacKernelSDK failed!")
                print(result.stdout.decode())
                return False

        chdir(self.working_dir / Path(name))
        if fat:
            self._bootstrap_clang32(self.working_dir / Path(name))
            build_opts += ["-arch", "x86_64", "-arch", "ACID32"]

        chdir(self.working_dir / Path(name))
        if edk2:
            self._bootstrap_edk2()

        chdir(self.working_dir / Path(name))
        if prebuild:
            print("\tRunning prebuild tasks...")
            for task in prebuild:
                print("\t\tRunning task '" + task["name"] + "'")
                args = [task["path"]]
                args.extend(task["args"])
                result = subprocess.run(args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
                if result.returncode != 0:
                    print("\t\tTask failed!")
                    print(result.stdout.decode())
                    return False
                else:
                    print("\t\tTask completed.")
        chdir(self.working_dir / Path(name))
        if isinstance(command, str) or (isinstance(command, list) and all(isinstance(n, str) for n in command)):
            print("\tBuilding...")
            if isinstance(command, str):
                command = command.split()
            result = subprocess.run(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tBuild failed!")
                print(result.stdout.decode())
                print("\tReturn code: " + str(result.returncode))
                return False
        elif isinstance(command, list) and all(isinstance(n, dict) for n in command):
            # Multiple commands
            for i in command:
                print("\t" + i["name"] + "...")
                result = subprocess.run([i["path"]] + i["args"], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
                if result.returncode != 0:
                    print("\tCommand failed!")
                    print(result.stdout.decode())
                    print("\tReturn code: " + str(result.returncode))
                    return False
        else:
            print("\tBuilding release version...")
            args = "xcodebuild -quiet -configuration Release".split()
            args += build_opts
            args += ["-jobs", "1"]
            # BUILD_DIR should only be added if we don't have scheme. Otherwise, use -derivedDataPath
            args += ["-derivedDataPath", "build"] if "-scheme" in build_opts else ["BUILD_DIR=build/"]

            result = subprocess.run(args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tBuild failed!")
                print(result.stdout.decode())
                print("\tReturn code: " + str(result.returncode))
                return False

            print("\tBuilding debug version...")
            args = "xcodebuild -quiet -configuration Debug".split()
            args += build_opts
            args += ["-jobs", "1"]
            # BUILD_DIR should only be added if we don't have scheme. Otherwise, use -derivedDataPath
            args += ["-derivedDataPath", "build"] if "-scheme" in build_opts else ["BUILD_DIR=build/"]

            result = subprocess.run(args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tBuild failed!")
                print(result.stdout.decode())
                print("\tReturn code: " + str(result.returncode))
                return False
        chdir(self.working_dir / Path(name))
        if postbuild:
            print("\tRunning postbuild tasks...")
            for task in postbuild:
                print("\t\tRunning task '" + task["name"] + "'")
                args = [task["path"]]
                args.extend(task["args"])
                result = subprocess.run(args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=task.get("cwd", None))
                if result.returncode != 0:
                    print("\t\tTask failed!")
                    print(result.stdout.decode())
                    return False
                else:
                    print("\t\tTask completed.")
        chdir(self.working_dir / Path(name))
        if v_cmd:
            if isinstance(v_cmd, str):
                v_cmd = v_cmd.split()
            result = subprocess.run(v_cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            if result.returncode != 0:
                print("\tRunning version command failed!")
                print(result.stdout.decode())
                return False
            else:
                version = result.stdout.decode().strip()
        elif b_type == "Kext":
            plistpath = Path(p_info)
            version = plistlib.load(plistpath.open(mode="rb"))["CFBundleVersion"]
        else:
            print("\tNo version command!")
            return False
        print("\tVersion: " + version)
        category_type = {"Kext": "Kexts", "Bootloader": "Bootloaders", "Utility": "Utilities", "Other": "Others"}[b_type]
        print("\tCopying to build directory...")
        extras = []
        # (extras.extend(self._expand_globs(i)) for i in extra_files) if extra_files is not None else None  # pylint: disable=expression-not-assigned
        if extra_files is not None:
            for i in extra_files:
                extras.extend(self._expand_globs(i))
        debug_file = self._expand_globs(d_file)[0]
        release_file = self._expand_globs(r_file)[0]
        debug_dir = self.build_dir / Path(category_type) / Path(name) / Path(commithash) / Path("Debug")
        release_dir = self.build_dir / Path(category_type) / Path(name) / Path(commithash) / Path("Release")
        for directory in [debug_dir, release_dir]:
            if directory.exists():
                shutil.rmtree(directory)
            directory.mkdir(parents=True)
        if extras:
            for i in extras:
                if i.is_dir():
                    print(f"\t{i} is a dir; please fix!")
                    shutil.copytree(i, debug_dir / i.name)
                    shutil.copytree(i, release_dir / i.name)
                elif i.is_file():
                    shutil.copy(i, debug_dir)
                    shutil.copy(i, release_dir)
                elif not i.exists():
                    print(f"\t{i} does not exist!")
                    return False
                else:
                    print(f"\t{i} is not a dir or a file!")
                    continue

        if debug_file.is_dir():
            print(f"{debug_file} is a dir; please fix!")
            shutil.copytree(debug_file, debug_dir / debug_file.name)
        elif debug_file.is_file():
            shutil.copy(debug_file, debug_dir)

        if release_file.is_dir():
            print(f"{release_file} is a dir; please fix!")
            shutil.copytree(release_file, release_dir / release_file.name)
        elif release_file.is_file():
            shutil.copy(release_file, release_dir)

        return {"debug": debug_dir / Path(debug_file.name), "release": release_dir / Path(release_file.name), "extras": [debug_dir / Path(i.name) for i in extras], "version": version}
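
# `Builder.build` accepts `Command` in three shapes: a single string, a list of strings,
# or a list of `{name, path, args}` dicts. A standalone sketch of that dispatch
# (`normalize_commands` is a hypothetical helper name, not in the source):

```python
def normalize_commands(command):
    # Mirror the three Command shapes accepted by Builder.build:
    #   "make all"                          -> one argv, split on whitespace
    #   ["make", "all"]                     -> one argv, used as-is
    #   [{"name": ..., "path": ..., "args": [...]}, ...] -> several argvs
    if isinstance(command, str):
        return [command.split()]
    if isinstance(command, list) and all(isinstance(n, str) for n in command):
        return [command]
    if isinstance(command, list) and all(isinstance(n, dict) for n in command):
        return [[c["path"]] + c["args"] for c in command]
    return None  # no Command given: Builder falls back to xcodebuild


single = normalize_commands("make all")
multi = normalize_commands([{"name": "configure", "path": "./configure", "args": ["--prefix=/usr"]}])
```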


================================================
FILE: check_ratelimit.py
================================================
import sys
from hammock import Hammock as hammock

token = sys.argv[1].strip()
eee = hammock("https://api.github.com/rate_limit").GET(auth=("github-actions", token))
print(eee.text or eee.content)
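
# check_ratelimit.py just prints the raw /rate_limit response; the fields of interest
# sit under `resources.core`. A sketch of reading them from a sample payload
# (values are illustrative, not live data):

```python
import json

# Sample body shaped like GitHub's /rate_limit endpoint response.
sample = json.loads(
    '{"resources": {"core": {"limit": 5000, "remaining": 4987, "reset": 1700000000}}}'
)
core = sample["resources"]["core"]
summary = f"{core['remaining']}/{core['limit']} core requests remaining"
print(summary)
```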


================================================
FILE: config_mgmt.py
================================================
import copy
import json
from pathlib import Path


def save_config(data: dict):
    config_dir = Path(__file__).parent.absolute() / Path("Config")
    plugin_dir = config_dir / Path("plugins")
    plugin_dir.mkdir(exist_ok=True)

    version = data["_version"]

    for plugin in data:
        if plugin == "_version":
            continue
        data[plugin]["versions"].sort(key=lambda x: (x["date_committed"], x["date_authored"]), reverse=True)
        json.dump(data[plugin] | {"_version": version}, (plugin_dir / Path(f"{plugin}.json")).open("w"), sort_keys=True)

    json.dump(data, (config_dir / Path("config.json")).open("w"), sort_keys=True)

    latest = copy.deepcopy(data)
    for plugin in latest:
        if plugin == "_version":
            continue
        latest[plugin]["versions"] = [latest[plugin]["versions"][0]]

    json.dump(latest, (config_dir / Path("latest.json")).open("w"), sort_keys=True)
    json.dump({"plugins": list(data.keys()), "_version": version}, (config_dir / Path("plugins.json")).open("w"), sort_keys=True)
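
# `save_config` derives latest.json by deep-copying the config and keeping only the
# newest version of each plugin. A standalone sketch of that trimming, with
# hypothetical sample data (versions already sorted newest-first, as save_config ensures):

```python
import copy

# Hypothetical config shaped like the one save_config receives.
data = {
    "_version": 2,
    "Lilu": {"type": "Kext", "versions": [{"version": "1.6.2"}, {"version": "1.6.1"}]},
}

latest = copy.deepcopy(data)
for plugin in latest:
    if plugin == "_version":
        continue  # the schema marker is carried through untouched
    latest[plugin]["versions"] = [latest[plugin]["versions"][0]]
```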


================================================
FILE: downloader.py
================================================
import json
import distutils.util
import zipfile
from pathlib import Path
from hammock import Hammock as hammock

plugins = hammock("https://raw.githubusercontent.com/dortania/build-repo/github-actions/plugins.json").GET()
plugins = json.loads(plugins.text)

config = hammock("https://raw.githubusercontent.com/dortania/build-repo/builds/config.json").GET()
config = json.loads(config.text)
print("Global Settings: ")
ensure_latest = bool(distutils.util.strtobool(input("Ensure latest? (\"true\" or \"false\") ").lower()))
unzip = bool(distutils.util.strtobool(input("Unzip automatically and delete zip? (\"true\" or \"false\") ").lower()))
extract_dir = input("Put files in directory (leave blank for current dir): ") if unzip else None
dbg = input("Debug or release? (\"debug\" or \"release\") ").lower()
while True:
    target = input("Enter product to download (case sensitive): ")
    try:
        if ensure_latest:
            organization = repo = None
            for plugin in plugins["Plugins"]:
                if plugin["Name"] == target:
                    organization, repo = plugin["URL"].strip().replace("https://github.com/", "").split("/")
                    break
            if not repo:
                print("Product " + target + " not available\n")
                continue
            commits_url = hammock("https://api.github.com").repos(organization, repo).commits.GET(params={"per_page": 100})
            commit_hash = json.loads(commits_url.text or commits_url.content)[0]["sha"]
            to_dl = None
            for i in config[target]["versions"]:
                if i["commit"]["sha"] == commit_hash:
                    to_dl = i
                    break
            if not to_dl:
                print("Latest version (" + commit_hash + ") unavailable\n")
                continue
        else:
            to_dl = config[target]["versions"][0]
        dl_link = to_dl["links"][dbg]
        print(f"Downloading {target} version {to_dl['version']} sha {to_dl['commit']['sha']} and date built {to_dl['date_built']}")
    except KeyError as error:
        if error.args[0] == target:
            print("Product " + error.args[0] + " not available\n")
            continue
        elif error.args[0] == dbg:
            print("Version " + error.args[0] + " not available\n")
            continue
        else:
            raise error
    file_name = Path(dl_link).name
    response = hammock(dl_link).GET()
    # Response bodies are bytes; writing .text here would raise a TypeError
    Path(file_name).write_bytes(response.content)
    print("Finished downloading.")
    if unzip:
        with zipfile.ZipFile(file_name, "r") as zip_ref:
            zip_ref.extractall(extract_dir)
        Path(file_name).unlink()
        print("Finished extracting.")
    print("Done.\n")

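The download loop above resolves a product name to a build entry in config.json, either the newest entry or the one matching the latest upstream commit. That selection can be factored into a small pure helper; a sketch, where `find_version` is a hypothetical name not present in the repo:

```python
def find_version(config, target, commit_sha=None):
    """Return the config entry for `target` matching `commit_sha`,
    or the newest entry when no sha is given; None if unavailable."""
    versions = config.get(target, {}).get("versions", [])
    if not versions:
        return None
    if commit_sha is None:
        return versions[0]  # entries are sorted newest-first
    for entry in versions:
        if entry["commit"]["sha"] == commit_sha:
            return entry
    return None

# Example data shaped like builds/config.json
sample = {
    "Lilu": {
        "versions": [
            {"commit": {"sha": "abc123"}, "version": "1.6.0"},
            {"commit": {"sha": "def456"}, "version": "1.5.9"},
        ]
    }
}
```

A helper like this also makes the "Product not available" and "Latest version unavailable" branches testable without hitting the GitHub API.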

================================================
FILE: local-test.sh
================================================
pip3 install -r requirements.txt
rm -Rf Config Temp Builds
git clone https://github.com/dortania/build-repo.git Config --depth 1 --single-branch --branch builds --sparse --filter=blob:none
python3 -u check_ratelimit.py
python3 -u updater.py
python3 -u check_ratelimit.py
python3 -u update_config.py

================================================
FILE: notify.py
================================================
import json
import os
import sys

import cryptography.fernet as fernet
import requests
from hammock import Hammock as hammock

JOB_LINK = None

webhook = sys.argv[2].strip()
fern = fernet.Fernet(sys.argv[3].strip().encode())


def get_current_run_link(token):
    global JOB_LINK
    if JOB_LINK:
        return JOB_LINK
    this_run = hammock(f"https://api.github.com/repos/{os.environ['GITHUB_REPOSITORY']}/actions/runs/{os.environ['GITHUB_RUN_ID']}/jobs", auth=("github-actions", token)).GET()
    try:
        this_run.raise_for_status()
    except requests.HTTPError as err:
        print(err)
        return
    this_job = [i for i in this_run.json()["jobs"] if i["name"] == os.environ['JOB_NAME']][0]
    JOB_LINK = this_job["html_url"]
    return JOB_LINK


def notify(token, results, status):
    if os.environ.get("PROD", "false") == "true":
        results = dict(results)
        results["status"] = status
        results["job_url"] = get_current_run_link(token)
        if results.get("files"):
            results["files"] = {k: str(v) for k, v in results["files"].items()}

        requests.post(webhook, data=fern.encrypt(json.dumps(results).encode()))


def notify_success(token, results):
    notify(token, results, "succeeded")


def notify_failure(token, results):
    notify(token, results, "failed")


def notify_error(token, results):
    notify(token, results, "errored")

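notify() builds its webhook payload by copying the results dict, attaching the run status and job URL, and stringifying any Path objects under "files" so the dict survives json.dumps before Fernet encryption. The payload construction alone can be sketched without the network or crypto; `build_payload` is a hypothetical helper, not part of notify.py:

```python
import json
from pathlib import Path

def build_payload(results, status, job_url):
    """Mirror notify()'s payload construction: copy the results dict,
    attach status and job URL, and stringify any Path values in
    `files` so the whole dict is JSON-serializable."""
    payload = dict(results)
    payload["status"] = status
    payload["job_url"] = job_url
    if payload.get("files"):
        payload["files"] = {k: str(v) for k, v in payload["files"].items()}
    return json.dumps(payload)

encoded = build_payload(
    {"plugin": {"Name": "Lilu"}, "files": {"release": Path("build/Release/Lilu.zip")}},
    "succeeded",
    "https://github.com/dortania/build-repo/actions/runs/1",
)
```

Copying with `dict(results)` matters: the caller in updater.py reuses the same dict afterwards, so notify() must not mutate it.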

================================================
FILE: parallel_check.py
================================================
import os
import sys
import time

import requests

token = sys.argv[1].strip()

session = requests.Session()
session.auth = ("github-actions", token)

this_run_url = f"https://api.github.com/repos/{os.environ['GITHUB_REPOSITORY']}/actions/runs/{os.environ['GITHUB_RUN_ID']}"
workflow_url = session.get(this_run_url).json()["workflow_url"]

runs = session.get(f"{workflow_url}/runs").json()
run_index = 0

for i, run in enumerate(runs["workflow_runs"]):
    if str(run["id"]) == str(os.environ["GITHUB_RUN_ID"]):
        run_index = i
        break

for i, run in enumerate(runs["workflow_runs"]):
    if i > run_index and str(run["id"]) != str(os.environ["GITHUB_RUN_ID"]) and run["status"] != "completed":
        print(f"Another build ({run['id']} with status {run['status']}) is running, cancelling this one...")
        cancel_request = session.post(f"{this_run_url}/cancel")
        if cancel_request.status_code != 202:
            sys.exit(f"Status code did not match: {cancel_request.status_code}")
        else:
            print("Cancel request acknowledged, sleeping 10 seconds to account for delay...")
            time.sleep(10)
            sys.exit(0)

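The cancellation rule above relies on the API listing workflow runs newest-first: any run after ours in the list is older, and if an older run is still incomplete, this run yields. A simplified pure-function version of that decision (hypothetical `should_cancel`, omitting the redundant id recheck from the second loop):

```python
def should_cancel(runs, current_id):
    """Decide whether the current workflow run should cancel itself.
    Runs are listed newest-first, so entries after ours are older;
    yield if any older run has not completed yet."""
    current_index = 0
    for i, run in enumerate(runs):
        if str(run["id"]) == str(current_id):
            current_index = i
            break
    return any(
        i > current_index and run["status"] != "completed"
        for i, run in enumerate(runs)
    )
```

Extracting the decision this way lets the serialization logic be unit-tested without GITHUB_RUN_ID or a live API.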

================================================
FILE: plugins.json
================================================
{
  "Plugins": [
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "An open source kernel extension providing a set of patches required for non-native Airport Broadcom Wi-Fi cards.",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "AirportBrcmFixup",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/AirportBrcmFixup"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "dynamic audio patching",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "AppleALC",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/AppleALC"
    },
    {
      "Build Opts": [
        "-target",
        "Package"
      ],
      "Debug File": "build/Debug/*.zip",
      "Desc": "An open source kernel extension which applies PatchRAM updates for Broadcom RAMUSB based devices",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "BrcmPatchRAM",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/BrcmPatchRAM"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Handler for brightness keys without DSDT patches",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "BrightnessKeys",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/BrightnessKeys"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Dynamic macOS CPU power management data injection",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "CPUFriend",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/CPUFriend"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Combines functionality of VoodooTSCSync and disabling xcpm_urgency if TSC is not in sync",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "CpuTscSync",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/CpuTscSync"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Various patches to install Rosetta cryptex",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "CryptexFixup",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/CryptexFixup"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "A Lilu plugin intended to enable debug output in the macOS kernel",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "DebugEnhancer",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/DebugEnhancer"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Allows reading Embedded Controller fields over 1 byte long",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "ECEnabler",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/1Revenger1/ECEnabler"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "SD host controller support for macOS",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "EmeraldSDHC",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/EmeraldSDHC"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Lilu Kernel extension for enabling Sidecar, NightShift, AirPlay to Mac and Universal Control support",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "FeatureUnlock",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/FeatureUnlock"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "A Lilu plugin intended to fix hibernation compatibility issues",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "HibernationFixup",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/HibernationFixup"
    },
    {
      "Build Opts": [
        "-alltargets"
      ],
      "Debug File": "build/Debug/*.zip",
      "Desc": "Intel Bluetooth Drivers for macOS",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "IntelBluetoothFirmware",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/OpenIntelWireless/IntelBluetoothFirmware"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Intel Ethernet LAN driver for macOS",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "IntelMausi",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/IntelMausi"
    },
    {
      "32-bit": true,
      "Debug File": "build/Debug/*.zip",
      "Desc": "for arbitrary kext, library, and program patching",
      "MacKernelSDK": true,
      "Name": "Lilu",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/Lilu"
    },
    {
      "Build Opts": [
        "-target",
        "Package"
      ],
      "Debug File": "build/Debug/*.zip",
      "Desc": "Hyper-V integration support for macOS",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "MacHyperVSupport",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/MacHyperVSupport"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "patches for the Apple NVMe storage driver, IONVMeFamily",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "NVMeFix",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/NVMeFix"
    },
    {
      "Command": [
        {
          "args": [],
          "name": "Building DuetPkg",
          "path": "./build_duet.tool"
        },
        {
          "args": [],
          "name": "Building OpenCorePkg",
          "path": "./build_oc.tool"
        }
      ],
      "Debug File": "Binaries/*DEBUG*.zip",
      "Desc": "OpenCore front end",
      "Max Per Run": 2,
      "Name": "OpenCorePkg",
      "Release File": "Binaries/*RELEASE*.zip",
      "Type": "Bootloader",
      "URL": "https://github.com/acidanthera/OpenCorePkg",
      "Version": [
        "awk",
        "/^#define OPEN_CORE_VERSION/ { print substr($3,2,5) }",
        "Include/Acidanthera/Library/OcMainLib.h"
      ]
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "open source kernel extension providing a way to emulate some offsets in your CMOS (RTC) memory",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "RTCMemoryFixup",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/RTCMemoryFixup"
    },
    {
      "Debug File": "debug.zip",
      "Desc": "OS X open source driver for the Realtek RTL8111/8168 family",
      "Name": "RealtekRTL8111",
      "MacKernelSDK": true,
      "Post-Build": [
        {
          "args": [
            "-r",
            "-X",
            "../../release.zip",
            "RealtekRTL8111.kext"
          ],
          "cwd": "build/Release",
          "name": "Zip Release Directory",
          "path": "zip"
        },
        {
          "args": [
            "-r",
            "-X",
            "../../debug.zip",
            "RealtekRTL8111.kext"
          ],
          "cwd": "build/Debug",
          "name": "Zip Debug Directory",
          "path": "zip"
        }
      ],
      "Release File": "release.zip",
      "URL": "https://github.com/Mieze/RTL8111_driver_for_OS_X"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Lilu kernel extension for blocking unwanted processes and unlocking support for certain features restricted to other hardware",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "RestrictEvents",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/RestrictEvents"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Serial mouse kernel extension for macOS",
      "MacKernelSDK": true,
      "Name": "SerialMouse",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/Goldfish64/SerialMouse"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "UEFI framebuffer driver for macOS",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "UEFIGraphicsFB",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/UEFIGraphicsFB"
    },
    {
      "32-bit": true,
      "Build Opts": [
        "-target",
        "Package"
      ],
      "Debug File": "build/Debug/*.zip",
      "Desc": "advanced Apple SMC emulator in the kernel",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "VirtualSMC",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/VirtualSMC"
    },
    {
      "Build Dir": "build/Build/Products/",
      "Build Opts": [
        "-workspace",
        "VoodooI2C.xcworkspace",
        "-scheme",
        "VoodooI2C"
      ],
      "Debug File": "build/Build/Products/Debug/debug.zip",
      "Desc": "Intel I2C controller and slave device drivers for macOS",
      "Extras": [
        "build/Build/Products/Release/release-dSYM.zip"
      ],
      "MacKernelSDK": true,
      "Name": "VoodooI2C",
      "Post-Build": [
        {
          "args": [
            "-r",
            "-X",
            "release.zip",
            ".",
            "-i",
            "./*.kext/*"
          ],
          "cwd": "build/Build/Products/Release",
          "name": "Zip Release Directory",
          "path": "zip"
        },
        {
          "args": [
            "-r",
            "-X",
            "release-dSYM.zip",
            ".",
            "-i",
            "./*.dSYM/*"
          ],
          "cwd": "build/Build/Products/Release",
          "name": "Zip Release dSYM",
          "path": "zip"
        },
        {
          "args": [
            "-r",
            "-X",
            "debug.zip",
            ".",
            "-i",
            "./*.kext/*"
          ],
          "cwd": "build/Build/Products/Debug",
          "name": "Zip Debug Directory",
          "path": "zip"
        }
      ],
      "Pre-Build": [
        {
          "args": [
            "-LfsO",
            "https://raw.githubusercontent.com/acidanthera/VoodooInput/master/VoodooInput/Scripts/bootstrap.sh"
          ],
          "name": "Download VoodooInput Bootstrap Script",
          "path": "curl"
        },
        {
          "args": [
            "+x",
            "bootstrap.sh"
          ],
          "name": "Make Bootstrap Executable",
          "path": "chmod"
        },
        {
          "args": [],
          "name": "Run VoodooInput Bootstrap",
          "path": "./bootstrap.sh"
        },
        {
          "args": [
            "VoodooInput",
            "Dependencies/"
          ],
          "name": "Move VoodooInput to Dependencies",
          "path": "mv"
        }
      ],
      "Release File": "build/Build/Products/Release/release.zip",
      "Type": "Kext",
      "URL": "https://github.com/VoodooI2C/VoodooI2C"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "Generic Multitouch Handler kernel extension for macOS",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "VoodooInput",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/VoodooInput"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "PS2 controller kext",
      "Info": "build/Release/VoodooPS2Controller.kext/Contents/Info.plist",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "VoodooPS2",
      "Pre-Build": [
        {
          "args": [
            "-LfsO",
            "https://raw.githubusercontent.com/acidanthera/VoodooInput/master/VoodooInput/Scripts/bootstrap.sh"
          ],
          "name": "Download VoodooInput Bootstrap Script",
          "path": "curl"
        },
        {
          "args": [
            "+x",
            "bootstrap.sh"
          ],
          "name": "Make Bootstrap Executable",
          "path": "chmod"
        },
        {
          "args": [],
          "name": "Run VoodooInput Bootstrap",
          "path": "./bootstrap.sh"
        }
      ],
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/VoodooPS2"
    },
    {
      "Command": "make",
      "Debug File": "build/Debug/*.zip",
      "Desc": "Refined macOS driver for ALPS TouchPads",
      "Info": "VoodooPS2Controller.kext/Contents/Info.plist",
      "Name": "VoodooPS2-Alps",
      "Post-Build": [
        {
          "args": [
            "-r",
            "-X",
            "release.zip",
            "."
          ],
          "cwd": "build/Release",
          "name": "Zip Release Directory",
          "path": "zip"
        },
        {
          "args": [
            "-r",
            "-X",
            "debug.zip",
            "."
          ],
          "cwd": "build/Debug",
          "name": "Zip Debug Directory",
          "path": "zip"
        }
      ],
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/1Revenger1/VoodooPS2-Alps"
    },
    {
      "Build Dir": "build/Build/Products/",
      "Build Opts": [
        "-scheme",
        "VoodooRMI"
      ],
      "Debug File": "build/Build/Products/Debug/*.zip",
      "Desc": "Synaptic Trackpad driver over SMBus/I2C for macOS",
      "MacKernelSDK": true,
      "Name": "VoodooRMI",
      "Release File": "build/Build/Products/Release/*.zip",
      "Type": "Kext",
      "URL": "https://github.com/VoodooSMBus/VoodooRMI"
    },
    {
      "Build Dir": "build/Build/Products/",
      "Build Opts": [
        "-scheme",
        "VoodooSMBus"
      ],
      "Debug File": "build/Build/Products/Debug/debug.zip",
      "Desc": "i2c-i801 driver port for macOS X + ELAN SMBus macOS X driver for Thinkpad T480s, L380, P52",
      "Name": "VoodooSMBus",
      "Post-Build": [
        {
          "args": [
            "-r",
            "-X",
            "release.zip",
            ".",
            "-i",
            "./*.kext/*"
          ],
          "cwd": "build/Build/Products/Release",
          "name": "Zip Release Directory",
          "path": "zip"
        },
        {
          "args": [
            "-r",
            "-X",
            "debug.zip",
            ".",
            "-i",
            "./*.kext/*"
          ],
          "cwd": "build/Build/Products/Debug",
          "name": "Zip Debug Directory",
          "path": "zip"
        }
      ],
      "Release File": "build/Build/Products/Release/release.zip",
      "Type": "Kext",
      "URL": "https://github.com/VoodooSMBus/VoodooSMBus"
    },
    {
      "Debug File": "build/Debug/*.zip",
      "Desc": "provides patches for AMD/Nvidia/Intel GPUs",
      "Lilu": true,
      "MacKernelSDK": true,
      "Name": "WhateverGreen",
      "Release File": "build/Release/*.zip",
      "URL": "https://github.com/acidanthera/WhateverGreen"
    },
    {
      "Build Opts": [
        "-arch",
        "x86_64",
        "-project",
        "gfxutil.xcodeproj",
        "ONLY_ACTIVE_ARCH=NO"
      ],
      "Debug File": "build/Debug/*.zip",
      "Desc": "Utility for working with EFI device paths and device properties",
      "EDK II": true,
      "MacKernelSDK": true,
      "Name": "gfxutil",
      "Release File": "build/Release/*.zip",
      "Type": "Utility",
      "URL": "https://github.com/acidanthera/gfxutil",
      "Version": [
        "awk",
        "/^#define VERSION/ { print substr($3,2,5) }",
        "main.h"
      ]
    }
  ]
}

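Each entry in plugins.json declares a build recipe: dependency flags ("Lilu", "MacKernelSDK"), glob patterns for the debug/release artifacts, and optional Pre-/Post-Build command lists. Consumers like updater.py derive the GitHub organization and repo from the "URL" field. A sketch of both patterns over a trimmed-down copy of the data (`lilu_dependents` and `org_and_repo` are hypothetical helper names):

```python
import json

plugins_json = """
{
  "Plugins": [
    {"Name": "Lilu", "MacKernelSDK": true,
     "URL": "https://github.com/acidanthera/Lilu"},
    {"Name": "WhateverGreen", "Lilu": true, "MacKernelSDK": true,
     "URL": "https://github.com/acidanthera/WhateverGreen"},
    {"Name": "OpenCorePkg", "Type": "Bootloader",
     "URL": "https://github.com/acidanthera/OpenCorePkg"}
  ]
}
"""

def lilu_dependents(data):
    """Names of plugins that declare a Lilu dependency."""
    return [p["Name"] for p in data["Plugins"] if p.get("Lilu")]

def org_and_repo(plugin):
    """Split a plugin's GitHub URL into (organization, repo),
    the same way updater.py does with strip/replace/split."""
    return tuple(plugin["URL"].replace("https://github.com/", "").split("/"))

data = json.loads(plugins_json)
```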

================================================
FILE: requirements.txt
================================================
# Builder dependencies
hammock
python-dateutil
datetime
termcolor2
purl
python-magic
humanize
gitpython
cryptography

# For ACID32
macholib

================================================
FILE: sort_plugins.py
================================================
from pathlib import Path
import json

plugins = json.load(Path("plugins.json").open())
plugins["Plugins"].sort(key=lambda x: x["Name"])
json.dump(plugins, Path("plugins.json").open("w"), indent=2, sort_keys=True)


================================================
FILE: update_config.py
================================================
import copy
import json
import os
import sys
import urllib.parse
from pathlib import Path

import dateutil.parser
import git
from hammock import Hammock as hammock

from config_mgmt import save_config

token = sys.argv[1].strip()


config: dict = json.load(Path("Config/config.json").open())
plugins = json.load(Path("plugins.json").open())

# version 2 to 3

if config["_version"] == 2:
    def add_author_date(name, version):
        if version.get("date_authored", None):
            return version
        else:
            organization = repo = None
            for plugin in plugins["Plugins"]:
                if name == "AppleSupportPkg" or name == "BT4LEContinuityFixup":
                    repo = name
                    organization = "acidanthera"
                    break
                elif name == "NoTouchID":
                    repo = name
                    organization = "al3xtjames"
                    break
                if plugin["Name"] == name:
                    organization, repo = plugin["URL"].strip().replace("https://github.com/", "").split("/")
                    break
            if not repo:
                print("Product " + name + " not found")
                raise Exception
            commit_date = dateutil.parser.parse(
                json.loads(hammock("https://api.github.com").repos(organization, repo).commits(version["commit"]["sha"]).GET(auth=("github-actions", token)).text)["commit"]["author"]["date"]
            )
            version["date_authored"] = commit_date.isoformat()
            return version

    config = {i: v for i, v in config.items() if not i.startswith("_")}

    for i in config:
        for j, item in enumerate(config[i]["versions"]):
            config[i]["versions"][j] = add_author_date(i, item)
            print(f"Added {config[i]['versions'][j]['date_authored']} for {i} {config[i]['versions'][j]['commit']['sha']}")

    for i in config:
        for j, item in enumerate(config[i]["versions"]):
            if not config[i]["versions"][j].get("date_committed"):
                config[i]["versions"][j]["date_committed"] = config[i]["versions"][j].pop("datecommitted")
            if not config[i]["versions"][j].get("date_built"):
                config[i]["versions"][j]["date_built"] = config[i]["versions"][j].pop("dateadded")

        config[i]["versions"].sort(key=lambda x: (x["date_committed"], x["date_authored"]), reverse=True)

    config["_version"] = 3

# version 3 to 4
# nothing changed, but the other json files were added

if config["_version"] == 3:
    config["_version"] = 4

save_config(config)

if os.environ.get("PROD", "false") == "true":
    repo = git.Repo("Config")
    if repo.is_dirty(untracked_files=True):
        repo.git.add(all=True)
        repo.git.commit(message="Deploying to builds")
        repo.git.push()

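The v2-to-v3 step above renames the legacy "datecommitted"/"dateadded" keys to "date_committed"/"date_built" on each version entry. The rename can be expressed as a small idempotent function (hypothetical `migrate_version_keys`, operating on a copy rather than mutating config in place):

```python
def migrate_version_keys(version):
    """Sketch of the v2-to-v3 key rename: move legacy 'datecommitted' /
    'dateadded' fields to 'date_committed' / 'date_built', leaving
    already-migrated entries untouched."""
    version = dict(version)
    if not version.get("date_committed") and "datecommitted" in version:
        version["date_committed"] = version.pop("datecommitted")
    if not version.get("date_built") and "dateadded" in version:
        version["date_built"] = version.pop("dateadded")
    return version
```

Guarding on the new key first makes the migration safe to re-run, which matters since the script is invoked on every workflow run.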

================================================
FILE: updater.py
================================================
import datetime
import json
import os
import sys
import traceback
from pathlib import Path

import dateutil.parser
import git
import humanize
from hammock import Hammock as hammock
from termcolor2 import c as color

import builder
from add import add_built
from notify import notify_error, notify_failure, notify_success


def matched_key_in_dict_array(array, key, value):
    if not array:
        return False
    for dictionary in array:
        if dictionary.get(key, None) == value:
            return True
    return False


MAX_OUTSTANDING_COMMITS = 3
DATE_DELTA = 7
RETRIES_BEFORE_FAILURE = 2

theJSON = json.load(Path("plugins.json").open())
plugins = theJSON.get("Plugins", [])

config_dir = Path("Config").resolve()

config = json.load((config_dir / Path("config.json")).open())
failures = json.load((config_dir / Path("failures.json")).open())


def add_to_failures(plugin):
    if not failures.get(plugin["plugin"]["Name"]):
        failures[plugin["plugin"]["Name"]] = {plugin["commit"]["sha"]: 1}
    elif not failures[plugin["plugin"]["Name"]].get(plugin["commit"]["sha"]):
        failures[plugin["plugin"]["Name"]][plugin["commit"]["sha"]] = 1
    else:
        failures[plugin["plugin"]["Name"]][plugin["commit"]["sha"]] += 1


last_updated_path = config_dir / Path("last_updated.txt")

info = []
to_build = []
to_add = []

if last_updated_path.is_file() and last_updated_path.stat().st_size != 0:
    date_to_compare = dateutil.parser.parse(last_updated_path.read_text())
    last_updated_path.write_text(datetime.datetime.now(tz=datetime.timezone.utc).isoformat())
else:
    last_updated_path.touch()
    date_to_compare = datetime.datetime(2021, 3, 1, tzinfo=datetime.timezone.utc)
    last_updated_path.write_text(date_to_compare.isoformat())

print("Last update date is " + date_to_compare.isoformat())

token = sys.argv[1].strip()

for plugin in plugins:
    organization, repo = plugin["URL"].strip().replace("https://github.com/", "").split("/")
    base_url = hammock("https://api.github.com")

    releases_url = base_url.repos(organization, repo).releases.GET(auth=("github-actions", token), params={"per_page": 100})
    releases = json.loads(releases_url.text or releases_url.content)
    if releases_url.headers.get("Link"):
        print(releases_url.headers["Link"])

    commits_url = base_url.repos(organization, repo).commits.GET(auth=("github-actions", token), params={"per_page": 100})
    commits = json.loads(commits_url.text or commits_url.content)
    if commits_url.headers.get("Link"):
        print(commits_url.headers["Link"])

    count = 1

    for commit in commits:
        commit_date = dateutil.parser.parse(commit["commit"]["committer"]["date"])
        newer = commit_date >= date_to_compare - datetime.timedelta(days=DATE_DELTA)

        if isinstance(plugin.get("Force", None), str):
            force_build = commit["sha"] == plugin.get("Force")
        else:
            force_build = plugin.get("Force") and commits.index(commit) == 0

        not_in_repo = True
        for i in config.get(plugin["Name"], {}).get("versions", []):
            if i["commit"]["sha"] == commit["sha"]:
                not_in_repo = False

        hit_failure_threshold = failures.get(plugin["Name"], {}).get(commit["sha"], 0) > RETRIES_BEFORE_FAILURE
        within_max_outstanding = count <= plugin.get("Max Per Run", MAX_OUTSTANDING_COMMITS)

        # Do not build if we hit the limit for builds per run for this plugin.
        if not within_max_outstanding:
            continue

        # Build if:
        # Newer than last checked and not in repo, OR not in repo and latest commit
        # AND must not have hit the failure threshold (more than RETRIES_BEFORE_FAILURE failures)
        # OR Force is set to true (ignores blacklist as this is manual intervention)

        if (((newer and not_in_repo) or (not_in_repo and commits.index(commit) == 0)) and not hit_failure_threshold) or force_build:
            if commits.index(commit) == 0:
                print(plugin["Name"] + " by " + organization + " latest commit (" + commit_date.isoformat() + ") not built")
            else:
                print(plugin["Name"] + " by " + organization + " commit " + commit["sha"] + " (" + commit_date.isoformat() + ") not built")
            to_build.append({"plugin": plugin, "commit": commit})
            count += 1
        elif hit_failure_threshold:
            print(plugin["Name"] + " by " + organization + " commit " + commit["sha"] + " (" + commit_date.isoformat() + ") has hit failure threshold!")

    for release in releases:
        release_date = dateutil.parser.parse(release["created_at"])
        if release_date >= date_to_compare:
            if releases.index(release) == 0:
                print(plugin["Name"] + " by " + organization + " latest release (" + release_date.isoformat() + ") not added")
            else:
                print(plugin["Name"] + " by " + organization + " release " + release["name"] + " (" + release_date.isoformat() + ") not added")
            to_add.append({"plugin": plugin, "release": release})


# for i in to_add: addRelease(i)


# Start setting up builder here.
builder = builder.Builder()

failed = []
succeeded = []
errored = []

print(color(f"\nBuilding {len(to_build)} things").bold)
for plugin in to_build:
    print(f"\nBuilding {color(plugin['plugin']['Name']).bold}")
    try:
        started = datetime.datetime.now()
        files = None
        files = builder.build(plugin["plugin"], commithash=plugin["commit"]["sha"])
    except Exception as error:
        duration = datetime.datetime.now() - started

        print("An error occurred!")
        print(error)
        traceback.print_tb(error.__traceback__)
        if files:
            print(f"Files: {files}")

        print(f"{color('Building of').red} {color(plugin['plugin']['Name']).red.bold} {color('errored').red}")
        print(f"Took {humanize.naturaldelta(duration)}")
        notify_error(token, plugin)
        errored.append(plugin)
        add_to_failures(plugin)
        continue

    duration = datetime.datetime.now() - started

    if files:
        print(f"{color('Building of').green} {color(plugin['plugin']['Name']).green.bold} {color('succeeded').green}")
        print(f"Took {humanize.naturaldelta(duration)}")

        results = plugin
        results["files"] = files

        print("Adding to config...")
        results["config_item"] = add_built(results, token)
        notify_success(token, results)
        succeeded.append(results)
    else:
        print(f"{color('Building of').red} {color(plugin['plugin']['Name']).red.bold} {color('failed').red}")
        print(f"Took {humanize.naturaldelta(duration)}")

        notify_failure(token, plugin)
        failed.append(plugin)
        add_to_failures(plugin)

print(color(f"\n{len(succeeded)} of {len(to_build)} built successfully\n").bold)
if len(succeeded) > 0:
    print(color("Succeeded:").green)
    for i in succeeded:
        print(i["plugin"]["Name"])
if len(failed) > 0:
    print(color("\nFailed:").red)
    for i in failed:
        print(i["plugin"]["Name"])
if len(errored) > 0:
    print(color("\nErrored:").red)
    for i in errored:
        print(i["plugin"]["Name"])

json.dump(failures, (config_dir / Path("failures.json")).open("w"), indent=2, sort_keys=True)


if os.environ.get("PROD", "false") == "true":
    repo = git.Repo(config_dir)
    if repo.is_dirty(untracked_files=True):
        repo.git.add(all=True)
        repo.git.commit(message="Deploying to builds")
        repo.git.push()


if len(failed) > 0 or len(errored) > 0:
    sys.exit(10)
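The failure bookkeeping in updater.py keeps a per-plugin, per-commit counter in failures.json; a commit is blacklisted once it has failed more than RETRIES_BEFORE_FAILURE times, unless Force overrides it. Both halves can be sketched without module-level state (`record_failure` and `hit_threshold` are hypothetical names; add_to_failures and the inline threshold check are the repo's actual versions):

```python
RETRIES_BEFORE_FAILURE = 2

def record_failure(failures, name, sha):
    """Increment the failure count for (plugin, commit),
    mirroring add_to_failures() without module-level state."""
    failures.setdefault(name, {})
    failures[name][sha] = failures[name].get(sha, 0) + 1
    return failures

def hit_threshold(failures, name, sha):
    """A commit is skipped once it has failed strictly more than
    RETRIES_BEFORE_FAILURE times (i.e. on its fourth attempt here)."""
    return failures.get(name, {}).get(sha, 0) > RETRIES_BEFORE_FAILURE
```

Using `setdefault` plus `get(sha, 0)` collapses the three-branch if/elif/else in add_to_failures into two lines with identical behavior.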
SYMBOL INDEX (21 symbols across 6 files)

FILE: add.py
  function hash_file (line 19) | def hash_file(file_path: Path):
  function expand_globs (line 23) | def expand_globs(str_path: str):
  function upload_release_asset (line 29) | def upload_release_asset(release_id, token, file_path: Path):
  function paginate (line 48) | def paginate(url, token):
  function add_built (line 60) | def add_built(plugin, token):

FILE: builder.py
  class Builder (line 13) | class Builder:
    method __init__ (line 14) | def __init__(self):
    method _expand_globs (line 31) | def _expand_globs(p: str):
    method _bootstrap_clang32 (line 39) | def _bootstrap_clang32(self, target_dir: Path):
    method _bootstrap_edk2 (line 61) | def _bootstrap_edk2(self):
    method _build_lilu (line 75) | def _build_lilu(self):
    method build (line 113) | def build(self, plugin, commithash=None):

FILE: config_mgmt.py
  function save_config (line 6) | def save_config(data: dict):

FILE: notify.py
  function get_current_run_link (line 15) | def get_current_run_link(token):
  function notify (line 30) | def notify(token, results, status):
  function notify_success (line 41) | def notify_success(token, results):
  function notify_failure (line 45) | def notify_failure(token, results):
  function notify_error (line 49) | def notify_error(token, results):

FILE: update_config.py
  function add_author_date (line 23) | def add_author_date(name, version):

FILE: updater.py
  function matched_key_in_dict_array (line 19) | def matched_key_in_dict_array(array, key, value):
  function add_to_failures (line 41) | def add_to_failures(plugin):
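The index lists signatures only, not bodies. As an illustration of the kind of helper behind them, hash_file(file_path) in add.py presumably digests release assets before upload; the sketch below is an assumption (the digest algorithm and chunk size are guesses, not taken from the repository):

```python
import hashlib
from pathlib import Path


def hash_file(file_path: Path) -> str:
    """Return a hex digest of a file, reading in chunks to bound memory use."""
    digest = hashlib.sha256()  # assumed algorithm; the real helper may differ
    with file_path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()
```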
Condensed preview — 19 files, each showing path, character count, and a content snippet (full structured content: ~65K chars).
[
  {
    "path": ".dccache",
    "chars": 1759,
    "preview": "[{\"/Users/dhinak/Documents/GitHub/build-repo/.pylintrc\":\"1\",\"/Users/dhinak/Documents/GitHub/build-repo/add.py\":\"2\",\"/Use"
  },
  {
    "path": ".flake8",
    "chars": 35,
    "preview": "[flake8]\nextend-ignore = E501, E203"
  },
  {
    "path": ".github/workflows/workflow.yaml",
    "chars": 2376,
    "preview": "name: Build\non:\n  push:\n  schedule:\n    - cron: '*/5 * * * *'\n  workflow_dispatch:\nenv:\n  FORCE_INSTALL: 1\n  HAS_OPENSSL"
  },
  {
    "path": ".gitignore",
    "chars": 84,
    "preview": "gh token.txt\nLilu-and-Friends/\n__pycache__/\nBuilds/\nConfig/\nTemp/\n.vscode/\n.DS_Store"
  },
  {
    "path": ".pylintrc",
    "chars": 359,
    "preview": "[MASTER]\n\ninit-hook=\"from pylint.config import find_pylintrc; import os, sys; sys.path.append(os.path.dirname(find_pylin"
  },
  {
    "path": "README.md",
    "chars": 199,
    "preview": "# build-repo\n\n![Build](https://github.com/dortania/build-repo/workflows/Build/badge.svg)\n\nCredit CorpNewt for Lilu & Fri"
  },
  {
    "path": "add.py",
    "chars": 6601,
    "preview": "import datetime\nimport hashlib\nimport json\nimport os\nimport time\nfrom pathlib import Path\n\nimport dateutil.parser\nimport"
  },
  {
    "path": "builder.py",
    "chars": 15085,
    "preview": "import io\nimport plistlib\nimport shutil\nimport stat\nimport subprocess\nimport zipfile\nfrom os import chdir\nfrom pathlib i"
  },
  {
    "path": "check_ratelimit.py",
    "chars": 197,
    "preview": "import sys\nfrom hammock import Hammock as hammock\n\ntoken = sys.argv[1].strip()\neee = hammock(\"https://api.github.com/rat"
  },
  {
    "path": "config_mgmt.py",
    "chars": 1079,
    "preview": "import copy\r\nimport json\r\nfrom pathlib import Path\r\n\r\n\r\ndef save_config(data: dict):\r\n    config_dir = Path(__file__).pa"
  },
  {
    "path": "downloader.py",
    "chars": 2748,
    "preview": "import json\nimport distutils.util\nimport zipfile\nfrom pathlib import Path\nfrom hammock import Hammock as hammock\n\nplugin"
  },
  {
    "path": "local-test.sh",
    "chars": 339,
    "preview": "pip3 install hammock python-dateutil datetime termcolor purl python-magic\nrm -Rf Config Temp Builds\ngit clone https://gi"
  },
  {
    "path": "notify.py",
    "chars": 1397,
    "preview": "import json\nimport os\nimport sys\n\nimport cryptography.fernet as fernet\nimport requests\nfrom hammock import Hammock as ha"
  },
  {
    "path": "parallel_check.py",
    "chars": 1166,
    "preview": "import os\nimport sys\nimport time\n\nimport requests\n\ntoken = sys.argv[1].strip()\n\nsession = requests.Session()\nsession.aut"
  },
  {
    "path": "plugins.json",
    "chars": 15450,
    "preview": "{\n  \"Plugins\": [\n    {\n      \"Debug File\": \"build/Debug/*.zip\",\n      \"Desc\": \"An open source kernel extension providing"
  },
  {
    "path": "requirements.txt",
    "chars": 151,
    "preview": "# Builder dependencies\r\nhammock\r\npython-dateutil\r\ndatetime\r\ntermcolor2\r\npurl\r\npython-magic\r\nhumanize\r\ngitpython\r\ncryptog"
  },
  {
    "path": "sort_plugins.py",
    "chars": 213,
    "preview": "from pathlib import Path\nimport json\n\nplugins = json.load(Path(\"plugins.json\").open())\nplugins[\"Plugins\"].sort(key=lambd"
  },
  {
    "path": "update_config.py",
    "chars": 2842,
    "preview": "import copy\nimport json\nimport os\nimport sys\nimport urllib.parse\nfrom pathlib import Path\n\nimport dateutil.parser\nimport"
  },
  {
    "path": "updater.py",
    "chars": 7615,
    "preview": "import datetime\nimport json\nimport os\nimport sys\nimport traceback\nfrom pathlib import Path\n\nimport dateutil.parser\nimpor"
  }
]

About this extraction

This page contains the full source code of the dortania/build-repo GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 19 files (58.3 KB), approximately 15.2k tokens, and a symbol index with 21 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.
