[
  {
    "path": ".github/CONTRIBUTING.md",
    "content": "Thank you for your interest in contributing to nixpkgs-update.\n\n# Licensing\n\nPlease indicate you license your contributions by commenting:\n\n```\nI hereby license my contributions to this repository under:\nCreative Commons Zero v1.0 Universal (SPDX Short Identifier: CC0-1.0)\n```\n\nin this [Pull Request thread](https://github.com/nix-community/nixpkgs-update/pull/116).\n"
  },
  {
    "path": ".github/FUNDING.yml",
    "content": "github: ryantm\npatreon: nixpkgsupdate\n"
  },
  {
    "path": ".github/dependabot.yml",
    "content": "version: 2\nupdates:\n  - package-ecosystem: \"github-actions\"\n    directory: \"/\"\n    schedule:\n      interval: \"weekly\"\n"
  },
  {
    "path": ".github/workflows/doc.yaml",
    "content": "name: doc\n\non:\n  push:\n    branches:\n      - main\n  workflow_dispatch:\n\npermissions:\n  contents: read\n  pages: write\n  id-token: write\n\nconcurrency:\n  group: \"doc\"\n  cancel-in-progress: false\n\njobs:\n  publish:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v6\n      - uses: cachix/install-nix-action@v31\n        with:\n          extra_nix_config: |\n            accept-flake-config = true\n            experimental-features = nix-command flakes\n      - name: Setup Pages\n        id: pages\n        uses: actions/configure-pages@v6\n      - name: Build Pages\n        run: |\n          nix build .#nixpkgs-update-doc\n      - name: Upload artifact\n        uses: actions/upload-pages-artifact@v5\n        with:\n          path: ./result/multi\n\n  deploy:\n    environment:\n      name: github-pages\n      url: ${{ steps.deployment.outputs.page_url }}\n    runs-on: ubuntu-latest\n    needs: publish\n    steps:\n      - name: Deploy to GitHub Pages\n        id: deployment\n        uses: actions/deploy-pages@v5\n"
  },
  {
    "path": ".github/workflows/flake-updates.yml",
    "content": "name: \"Update flakes\"\non:\n  workflow_dispatch:\n  schedule:\n    - cron: \"0 0 1 * *\"\njobs:\n  createPullRequest:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v6\n      - name: Install Nix\n        uses: cachix/install-nix-action@v31\n        with:\n          extra_nix_config: |\n            experimental-features = nix-command flakes\n      - name: Update flake.lock\n        uses: DeterminateSystems/update-flake-lock@v28\n"
  },
  {
    "path": ".gitignore",
    "content": ".ghc*\n/github_token.txt\n/packages-to-update.txt\n/result\n/result-doc\ndist-newstyle/\ndist/\ntest_data/actual*\n.direnv\n"
  },
  {
    "path": "CVENOTES.org",
    "content": "* Issues\n** https://github.com/NixOS/nixpkgs/pull/74184#issuecomment-565891652\n* Fixed\n** uzbl: 0.9.0 -> 0.9.1\n  - [[https://nvd.nist.gov/vuln/detail/CVE-2010-0011][CVE-2010-0011]]\n  - [[https://nvd.nist.gov/vuln/detail/CVE-2010-2809][CVE-2010-2809]]\n\n  Both CVEs refer to matchers that are date based releases, but the\n  author of the library switched to normal version numbering after\n  that, so these CVEs are reported as relevant even though they are\n  not.\n** terraform: 0.12.7 -> 0.12.9\n   - [[https://nvd.nist.gov/vuln/detail/CVE-2018-9057][CVE-2018-9057]]\n\n   https://nvd.nist.gov/products/cpe/detail/492339?keyword=cpe:2.3:a:hashicorp:terraform:1.12.0:*:*:*:*:aws:*:*&status=FINAL,DEPRECATED&orderBy=CPEURI&namingFormat=2.3\n\n   CVE only applies to terraform-providers-aws, but you can only tell that by looking at the \"Target Software\" part.\n** tor: 0.4.1.5 -> 0.4.1.6\n   https://nvd.nist.gov/vuln/detail/CVE-2017-16541\n\n  the CPE mistakenly uses tor for the product id when the product id should be torbrowser\n** arena: 1.1 -> 1.06\n  - [[https://nvd.nist.gov/vuln/detail/CVE-2018-8843][CVE-2018-8843]]\n  - [[https://nvd.nist.gov/vuln/detail/CVE-2019-15567][CVE-2019-15567]]\n\n   Not rockwellautomation:arena\n   Not openforis:arena\n** thrift\n   Apache Thrift vs Facebook Thrift\n** go: 1.13.3 -> 1.13.4\n   https://github.com/NixOS/nixpkgs/pull/72516\n\n   Looks like maybe go used to use dates for versions and now uses\n   regular versions\n** kanboard: 1.2.11 -> 1.2.12\n   https://github.com/NixOS/nixpkgs/pull/74429\n   cve is about a kanboard plugin provided by jenkins not kanboard itself\n"
  },
  {
    "path": "LICENSE",
    "content": "Creative Commons Legal Code\n\nCC0 1.0 Universal\n\n    CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE\n    LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN\n    ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS\n    INFORMATION ON AN \"AS-IS\" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES\n    REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS\n    PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM\n    THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED\n    HEREUNDER.\n\nStatement of Purpose\n\nThe laws of most jurisdictions throughout the world automatically confer\nexclusive Copyright and Related Rights (defined below) upon the creator\nand subsequent owner(s) (each and all, an \"owner\") of an original work of\nauthorship and/or a database (each, a \"Work\").\n\nCertain owners wish to permanently relinquish those rights to a Work for\nthe purpose of contributing to a commons of creative, cultural and\nscientific works (\"Commons\") that the public can reliably and without fear\nof later claims of infringement build upon, modify, incorporate in other\nworks, reuse and redistribute as freely as possible in any form whatsoever\nand for any purposes, including without limitation commercial purposes.\nThese owners may contribute to the Commons to promote the ideal of a free\nculture and the further production of creative, cultural and scientific\nworks, or to gain reputation or greater distribution for their Work in\npart through the use and efforts of others.\n\nFor these and/or other purposes and motivations, and without any\nexpectation of additional consideration or compensation, the person\nassociating CC0 with a Work (the \"Affirmer\"), to the extent that he or she\nis an owner of Copyright and Related Rights in the Work, voluntarily\nelects to apply CC0 to the Work and publicly distribute the Work under its\nterms, with knowledge of his or her Copyright 
and Related Rights in the\nWork and the meaning and intended legal effect of CC0 on those rights.\n\n1. Copyright and Related Rights. A Work made available under CC0 may be\nprotected by copyright and related or neighboring rights (\"Copyright and\nRelated Rights\"). Copyright and Related Rights include, but are not\nlimited to, the following:\n\n  i. the right to reproduce, adapt, distribute, perform, display,\n     communicate, and translate a Work;\n ii. moral rights retained by the original author(s) and/or performer(s);\niii. publicity and privacy rights pertaining to a person's image or\n     likeness depicted in a Work;\n iv. rights protecting against unfair competition in regards to a Work,\n     subject to the limitations in paragraph 4(a), below;\n  v. rights protecting the extraction, dissemination, use and reuse of data\n     in a Work;\n vi. database rights (such as those arising under Directive 96/9/EC of the\n     European Parliament and of the Council of 11 March 1996 on the legal\n     protection of databases, and under any national implementation\n     thereof, including any amended or successor version of such\n     directive); and\nvii. other similar, equivalent or corresponding rights throughout the\n     world based on applicable law or treaty, and any national\n     implementations thereof.\n\n2. Waiver. 
To the greatest extent permitted by, but not in contravention\nof, applicable law, Affirmer hereby overtly, fully, permanently,\nirrevocably and unconditionally waives, abandons, and surrenders all of\nAffirmer's Copyright and Related Rights and associated claims and causes\nof action, whether now known or unknown (including existing as well as\nfuture claims and causes of action), in the Work (i) in all territories\nworldwide, (ii) for the maximum duration provided by applicable law or\ntreaty (including future time extensions), (iii) in any current or future\nmedium and for any number of copies, and (iv) for any purpose whatsoever,\nincluding without limitation commercial, advertising or promotional\npurposes (the \"Waiver\"). Affirmer makes the Waiver for the benefit of each\nmember of the public at large and to the detriment of Affirmer's heirs and\nsuccessors, fully intending that such Waiver shall not be subject to\nrevocation, rescission, cancellation, termination, or any other legal or\nequitable action to disrupt the quiet enjoyment of the Work by the public\nas contemplated by Affirmer's express Statement of Purpose.\n\n3. Public License Fallback. Should any part of the Waiver for any reason\nbe judged legally invalid or ineffective under applicable law, then the\nWaiver shall be preserved to the maximum extent permitted taking into\naccount Affirmer's express Statement of Purpose. 
In addition, to the\nextent the Waiver is so judged Affirmer hereby grants to each affected\nperson a royalty-free, non transferable, non sublicensable, non exclusive,\nirrevocable and unconditional license to exercise Affirmer's Copyright and\nRelated Rights in the Work (i) in all territories worldwide, (ii) for the\nmaximum duration provided by applicable law or treaty (including future\ntime extensions), (iii) in any current or future medium and for any number\nof copies, and (iv) for any purpose whatsoever, including without\nlimitation commercial, advertising or promotional purposes (the\n\"License\"). The License shall be deemed effective as of the date CC0 was\napplied by Affirmer to the Work. Should any part of the License for any\nreason be judged legally invalid or ineffective under applicable law, such\npartial invalidity or ineffectiveness shall not invalidate the remainder\nof the License, and in such case Affirmer hereby affirms that he or she\nwill not (i) exercise any of his or her remaining Copyright and Related\nRights in the Work or (ii) assert any associated claims and causes of\naction with respect to the Work, in either case contrary to Affirmer's\nexpress Statement of Purpose.\n\n4. Limitations and Disclaimers.\n\n a. No trademark or patent rights held by Affirmer are waived, abandoned,\n    surrendered, licensed or otherwise affected by this document.\n b. Affirmer offers the Work as-is and makes no representations or\n    warranties of any kind concerning the Work, express, implied,\n    statutory or otherwise, including without limitation warranties of\n    title, merchantability, fitness for a particular purpose, non\n    infringement, or the absence of latent or other defects, accuracy, or\n    the present or absence of errors, whether or not discoverable, all to\n    the greatest extent permissible under applicable law.\n c. 
Affirmer disclaims responsibility for clearing rights of other persons\n    that may apply to the Work or any use thereof, including without\n    limitation any person's Copyright and Related Rights in the Work.\n    Further, Affirmer disclaims responsibility for obtaining any necessary\n    consents, permissions or other rights required for any use of the\n    Work.\n d. Affirmer understands and acknowledges that Creative Commons is not a\n    party to this document and has no duty or obligation with respect to\n    this CC0 or use of the Work."
  },
  {
    "path": "README.md",
    "content": "# nixpkgs-update\n\n[![Patreon](https://img.shields.io/badge/patreon-donate-blue.svg)](https://www.patreon.com/nixpkgsupdate)\n\nPlease read the [documentation](https://nix-community.github.io/nixpkgs-update/).\n"
  },
  {
    "path": "app/Main.hs",
    "content": "{-# LANGUAGE ExtendedDefaultRules #-}\n{-# LANGUAGE NamedFieldPuns #-}\n{-# LANGUAGE OverloadedStrings #-}\n{-# OPTIONS_GHC -fno-warn-type-defaults #-}\n\nmodule Main where\n\nimport Control.Applicative ((<**>))\nimport qualified Data.Text as T\nimport qualified Data.Text.IO as T\nimport DeleteMerged (deleteDone)\nimport Git\nimport qualified GitHub as GH\nimport NVD (withVulnDB)\nimport qualified Nix\nimport qualified Options.Applicative as O\nimport OurPrelude\nimport qualified Repology\nimport System.IO (BufferMode (..), hSetBuffering, stderr, stdout)\nimport qualified System.Posix.Env as P\nimport Update (cveAll, cveReport, sourceGithubAll, updatePackage)\nimport Utils (Options (..), UpdateEnv (..), getGithubToken, getGithubUser)\n\ndefault (T.Text)\n\ndata UpdateOptions = UpdateOptions\n  { pr :: Bool,\n    cve :: Bool,\n    nixpkgsReview :: Bool,\n    outpaths :: Bool,\n    attrpathOpt :: Bool\n  }\n\ndata Command\n  = Update UpdateOptions Text\n  | UpdateBatch UpdateOptions Text\n  | DeleteDone Bool\n  | Version\n  | UpdateVulnDB\n  | CheckAllVulnerable\n  | SourceGithub\n  | FetchRepology\n  | CheckVulnerable Text Text Text\n\nupdateOptionsParser :: O.Parser UpdateOptions\nupdateOptionsParser =\n  UpdateOptions\n    <$> O.flag False True (O.long \"pr\" <> O.help \"Make a pull request using Hub.\")\n    <*> O.flag False True (O.long \"cve\" <> O.help \"Make a CVE vulnerability report.\")\n    <*> O.flag False True (O.long \"nixpkgs-review\" <> O.help \"Runs nixpkgs-review on update commit rev\")\n    <*> O.flag False True (O.long \"outpaths\" <> O.help \"Calculate outpaths to determine the branch to target\")\n    <*> O.flag False True (O.long \"attrpath\" <> O.help \"UPDATE_INFO uses the exact attrpath.\")\n\nupdateParser :: O.Parser Command\nupdateParser =\n  Update\n    <$> updateOptionsParser\n    <*> O.strArgument (O.metavar \"UPDATE_INFO\" <> O.help \"update string of the form: 'pkg oldVer newVer update-page'\\n\\n example: 'tflint 
0.15.0 0.15.1 repology.org'\")\n\nupdateBatchParser :: O.Parser Command\nupdateBatchParser =\n  UpdateBatch\n    <$> updateOptionsParser\n    <*> O.strArgument (O.metavar \"UPDATE_INFO\" <> O.help \"update string of the form: 'pkg oldVer newVer update-page'\\n\\n example: 'tflint 0.15.0 0.15.1 repology.org'\")\n\ndeleteDoneParser :: O.Parser Command\ndeleteDoneParser =\n  DeleteDone\n    <$> O.flag False True (O.long \"delete\" <> O.help \"Actually delete the done branches. Otherwise just prints the branches to delete.\")\n\ncommandParser :: O.Parser Command\ncommandParser =\n  O.hsubparser\n    ( O.command\n        \"update\"\n        (O.info (updateParser) (O.progDesc \"Update one package\"))\n        <> O.command\n          \"update-batch\"\n          (O.info (updateBatchParser) (O.progDesc \"Update one package in batch mode.\"))\n        <> O.command\n          \"delete-done\"\n          ( O.info\n              deleteDoneParser\n              (O.progDesc \"Deletes branches from PRs that were merged or closed\")\n          )\n        <> O.command\n          \"version\"\n          ( O.info\n              (pure Version)\n              ( O.progDesc\n                  \"Displays version information for nixpkgs-update and dependencies\"\n              )\n          )\n        <> O.command\n          \"update-vulnerability-db\"\n          ( O.info\n              (pure UpdateVulnDB)\n              (O.progDesc \"Updates the vulnerability database\")\n          )\n        <> O.command\n          \"check-vulnerable\"\n          (O.info checkVulnerable (O.progDesc \"checks if something is vulnerable\"))\n        <> O.command\n          \"check-all-vulnerable\"\n          ( O.info\n              (pure CheckAllVulnerable)\n              (O.progDesc \"checks all packages to update for vulnerabilities\")\n          )\n        <> O.command\n          \"source-github\"\n          (O.info (pure SourceGithub) (O.progDesc \"looks for updates on GitHub\"))\n        <> O.command\n     
     \"fetch-repology\"\n          (O.info (pure FetchRepology) (O.progDesc \"fetches update from Repology and prints them to stdout\"))\n    )\n\ncheckVulnerable :: O.Parser Command\ncheckVulnerable =\n  CheckVulnerable\n    <$> O.strArgument (O.metavar \"PRODUCT_ID\")\n    <*> O.strArgument (O.metavar \"OLD_VERSION\")\n    <*> O.strArgument (O.metavar \"NEW_VERSION\")\n\nprogramInfo :: O.ParserInfo Command\nprogramInfo =\n  O.info\n    (commandParser <**> O.helper)\n    ( O.fullDesc\n        <> O.progDesc \"Update packages in the Nixpkgs repository\"\n        <> O.header \"nixpkgs-update\"\n    )\n\nmain :: IO ()\nmain = do\n  hSetBuffering stdout LineBuffering\n  hSetBuffering stderr LineBuffering\n  command <- O.execParser programInfo\n  ghUser <- getGithubUser\n  token <- fromMaybe \"\" <$> getGithubToken\n  P.setEnv \"GITHUB_TOKEN\" (T.unpack token) True\n  P.setEnv \"GITHUB_API_TOKEN\" (T.unpack token) True\n  P.setEnv \"PAGER\" \"\" True\n  case command of\n    DeleteDone delete -> do\n      setupNixpkgs $ GH.untagName ghUser\n      deleteDone delete token ghUser\n    Update UpdateOptions {pr, cve, nixpkgsReview, outpaths, attrpathOpt} update -> do\n      setupNixpkgs $ GH.untagName ghUser\n      updatePackage (Options pr False ghUser token cve nixpkgsReview outpaths attrpathOpt) update\n    UpdateBatch UpdateOptions {pr, cve, nixpkgsReview, outpaths, attrpathOpt} update -> do\n      setupNixpkgs $ GH.untagName ghUser\n      updatePackage (Options pr True ghUser token cve nixpkgsReview outpaths attrpathOpt) update\n    Version -> do\n      v <- runExceptT Nix.version\n      case v of\n        Left t -> T.putStrLn (\"error:\" <> t)\n        Right t -> T.putStrLn t\n    UpdateVulnDB -> withVulnDB $ \\_conn -> pure ()\n    CheckAllVulnerable -> do\n      setupNixpkgs $ GH.untagName ghUser\n      updates <- T.readFile \"packages-to-update.txt\"\n      cveAll undefined updates\n    CheckVulnerable productID oldVersion newVersion -> do\n      setupNixpkgs $ 
GH.untagName ghUser\n      report <-\n        cveReport\n          (UpdateEnv productID oldVersion newVersion Nothing (Options False False ghUser token False False False False))\n      T.putStrLn report\n    SourceGithub -> do\n      updates <- T.readFile \"packages-to-update.txt\"\n      setupNixpkgs $ GH.untagName ghUser\n      sourceGithubAll (Options False False ghUser token False False False False) updates\n    FetchRepology -> Repology.fetch\n"
  },
  {
    "path": "doc/batch-updates.md",
    "content": "# Batch updates {#batch-updates}\n\nnixpkgs-update supports batch updates via the `update-list`\nsubcommand.\n\n## Update-List tutorial\n\n1. Setup [hub](https://github.com/github/hub) and give it your GitHub\n   credentials, so it saves an oauth token. This allows nixpkgs-update\n   to query the GitHub API.  Alternatively, if you prefer not to install\n   and configure `hub`, you can manually create a GitHub token with\n   `repo` and `gist` scopes.  Provide it to `nixpkgs-update` by\n   exporting it as the `GITHUB_TOKEN` environment variable\n   (`nixpkgs-update` reads credentials from the files `hub` uses but\n   no longer uses `hub` itself).\n\n2. Clone this repository and build `nixpkgs-update`:\n    ```bash\n    git clone https://github.com/nix-community/nixpkgs-update && cd nixpkgs-update\n    nix-build\n    ```\n\n3. To test your config, try to update a single package, like this:\n\n   ```bash\n   ./result/bin/nixpkgs-update update \"pkg oldVer newVer update-page\"`\n\n   # Example:\n   ./result/bin/nixpkgs-update update \"tflint 0.15.0 0.15.1 repology.org\"`\n   ```\n\n   replacing `tflint` with the attribute name of the package you actually want\n   to update, and the old version and new version accordingly.\n\n   If this works, you are now setup to hack on `nixpkgs-update`! If\n   you run it with `--pr`, it will actually send a pull request, which\n   looks like this: https://github.com/NixOS/nixpkgs/pull/82465\n\n\n4. 
If you'd like to send a batch of updates, get a list of outdated packages and\n   place them in a `packages-to-update.txt` file:\n\n  ```bash\n  ./result/bin/nixpkgs-update fetch-repology > packages-to-update.txt\n  ```\n\n  There are also alternative sources of updates, including:\n\n   - package updateScript:\n      [passthru.updateScript](https://nixos.org/manual/nixpkgs/unstable/#var-passthru-updateScript)\n   - GitHub releases:\n     [nixpkgs-update-github-releases](https://github.com/synthetica9/nixpkgs-update-github-releases)\n\n5. Run the tool in batch mode with `update-list`:\n\n  ```bash\n  ./result/bin/nixpkgs-update update-list\n  ```\n"
  },
  {
    "path": "doc/contact.md",
    "content": "# Contact {#contact}\n\nGithub: [https://github.com/nix-community/nixpkgs-update](https://github.com/nix-community/nixpkgs-update)\n\nMatrix: [https://matrix.to/#/#nixpkgs-update:nixos.org](https://matrix.to/#/#nixpkgs-update:nixos.org)\n"
  },
  {
    "path": "doc/contributing.md",
    "content": "# Contributing {#contributing}\n\nIncremental development:\n\n```bash\nnix-shell --run \"cabal v2-repl\"\n```\n\nRun the tests:\n\n```bash\nnix-shell --run \"cabal v2-test\"\n```\n\nRun a type checker in the background for quicker type checking feedback:\n\n```bash\nnix-shell --run \"ghcid\"\n```\n\nRun a type checker for the app code:\n\n```bash\nnix-shell --run 'ghcid -c \"cabal v2-repl exe:nixpkgs-update\"'\n```\n\nRun a type checker for the test code:\n\n```bash\nnix-shell --run 'ghcid -c \"cabal v2-repl tests\"'\n```\n\nUpdating the Cabal file when adding new dependencies or options:\n\n```bash\nnix run nixpkgs#haskellPackages.hpack\n```\n\nSource files are formatted with [Ormolu](https://github.com/tweag/ormolu).\n\nThere is also a [Cachix cache](https://nix-community.cachix.org/) available for the dependencies of this program.\n"
  },
  {
    "path": "doc/details.md",
    "content": "# Details {#details}\n\nSome of these features only apply to the update-list sub-command or to\nfeatures only available to the @r-ryantm bot.\n\n## Checks\n\nA number of checks are performed to help nixpkgs maintainers gauge the\nlikelihood that an update was successful. All the binaries are run with\nvarious flags to see if they have a zero exit code and output the new\nversion number. The outpath directory tree is searched for files\ncontaining the new version number. A directory tree and disk usage\nlisting is provided.\n\n\n## Security report\n\nInformation from the National Vulnerability Database maintained by\nNIST is compared against the current and updated package version. The\nnixpkgs package name is matched with the Common Platform Enumeration\nvendor, product, edition, software edition, and target software fields\nto find candidate Common Vulnerabilities and Exposures (CVEs). The\nCVEs are filtered by the matching the current and updated versions\nwith the CVE version ranges.\n\nThe general philosophy of the CVE search is to avoid false negatives,\nwhich means we expect to generate many false positives. The false\npositives can be carefully removed by manually created rules\nimplemented in the filter function in the NVDRules module.\n\nIf there are no CVE matches, the report is not shown. The report has\nthree parts: CVEs resolved by this update, CVEs introduced by this\nupdate, and CVEs present in both version.\n\nIf you would like to report a problem with the security report, please\nuse the [nixpkgs-update GitHub\nissues](https://github.com/nix-community/nixpkgs-update/issues).\n\nThe initial development of the security report was made possible by a\npartnership with [Serokell](https://serokell.io/) and the [NLNet\nFoundation](https://nlnet.nl/) through their [Next Generation Internet\nZero Discovery initiative](https://nlnet.nl/discovery/) (NGI0\nDiscovery). 
NGI0 Discovery is made possible with financial support\nfrom the [European Commission](https://ec.europa.eu/).\n\n\n## Rebuild report\n\nThe PRs made by nixpkgs-update say what packages need to be rebuilt if\nthe pull request is merged. This uses the same mechanism\n[OfBorg](https://github.com/NixOS/ofborg) uses to put rebuild labels\non PRs. Not limited by labels, it can report the exact number of\nrebuilds and list some of the attrpaths that would need to be rebuilt.\n\n\n## PRs against staging\n\nIf a PR merge would cause more than 500 packages to be rebuilt, the PR\nis made against staging.\n\n\n## Logs\n\n[Logs from r-ryantm's runs](https://nixpkgs-update-logs.nix-community.org/) are\navailable online. There are a lot of packages `nixpkgs-update`\ncurrently has no hope of updating. Please dredge the logs to find out\nwhy your pet package is not receiving updates.\n\n\n## Cache\n\nBy serving the build outputs from\n[https://nixpkgs-update-cache.nix-community.org/](https://nixpkgs-update-cache.nix-community.org/), nixpkgs-update allows you to\ntest a package with one command.\n"
  },
  {
    "path": "doc/donate.md",
    "content": "# Donate {#donate}\n\n[@r-ryantm](https://github.com/r-ryantm), the bot that updates Nixpkgs, is currently running on a Hetzner bare-metal server that costs me €60 per month. Your support in paying for infrastructure would be a great help:\n\n* [GitHub Sponsors](https://github.com/sponsors/ryantm)\n* [Patreon](https://www.patreon.com/nixpkgsupdate)\n"
  },
  {
    "path": "doc/installation.md",
    "content": "# Installation {#installation}\n\n::: note\nFor the Cachix cache to work, your user must be in the trusted-users\nlist or you can use sudo since root is effectively trusted.\n:::\n\nRun without installing on stable Nix:\n\n```ShellSession\n$ nix run \\\n  --option extra-substituters 'https://nix-community.cachix.org/' \\\n  --option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \\\n  -f https://github.com/nix-community/nixpkgs-update/archive/main.tar.gz \\\n  -c nixpkgs-update --help\n```\n\nRun without installing on unstable Nix with nix command enabled:\n\n```ShellSession\n$ nix shell \\\n  --option extra-substituters 'https://nix-community.cachix.org/' \\\n  --option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \\\n  -f https://github.com/nix-community/nixpkgs-update/archive/main.tar.gz \\\n  -c nixpkgs-update --help\n```\n\nRun without installing on unstable Nix with nix flakes enabled:\n\n```ShellSession\n$ nix run \\\n  --option extra-substituters 'https://nix-community.cachix.org/' \\\n  --option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \\\n  github:nix-community/nixpkgs-update -- --help\n```\n\nInstall into your Nix profile:\n\n```ShellSession\n$ nix-env \\\n  --option extra-substituters 'https://nix-community.cachix.org/' \\\n  --option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \\\n  -if https://github.com/nix-community/nixpkgs-update/archive/main.tar.gz\n```\n\nDeclaratively with [niv](https://github.com/nmattia/niv):\n\n```ShellSession\n$ niv add nix-community/nixpkgs-update\n```\n\nNixOS config with Niv:\n\n```nix\nlet\n  sources = import ./nix/sources.nix;\n  nixpkgs-update = import sources.nixpkgs-update {};\nin\n  environment.systemPackages = [ nixpkgs-update ];\n```\n\nhome-manager config with 
Niv:\n\n```nix\nlet\n  sources = import ./nix/sources.nix;\n  nixpkgs-update = import sources.nixpkgs-update {};\nin\n  home.packages = [ nixpkgs-update ];\n```\n"
  },
  {
    "path": "doc/interactive-updates.md",
    "content": "# Interactive updates {#interactive-updates}\n\nnixpkgs-update supports interactive, single package updates via the\n`update` subcommand.\n\n# Update tutorial\n\n1. Setup [`hub`](https://github.com/github/hub) and give it your\n   GitHub credentials.  Alternatively, if you prefer not to install\n   and configure `hub`, you can manually create a GitHub token with\n   `repo` and `gist` scopes.  Provide it to `nixpkgs-update` by\n   exporting it as the `GITHUB_TOKEN` environment variable\n   (`nixpkgs-update` reads credentials from the files `hub` uses but\n   no longer uses `hub` itself).\n2. Go to your local checkout of nixpkgs, and **make sure the working\n   directory is clean**. Be on a branch you are okay committing to.\n3. Ensure that there is an Git origin called `upstream` which points to nixpkgs:\n   ```sh\n   git remote add upstream \"https://github.com/NixOS/nixpkgs.git\"\n   ```\n4. Run it like: `nixpkgs-update update \"postman 7.20.0 7.21.2\"`\n   which mean update the package \"postman\" from version 7.20.0\n   to version 7.21.2.\n5. It will run the updater, and, if the update builds, it will commit\n   the update and output a message you could use for a pull request.\n\n# Flags\n\n`--cve`\n\n: adds CVE vulnerability reporting to the PR message. On\n  first invocation with this option, a CVE database is\n  built. Subsequent invocations will be much faster.\n\n`--nixpkgs-review`\n\n: runs [nixpkgs-review](https://github.com/Mic92/nixpkgs-review),\n  which tries to build all the packages that depend on the one being\n  updated and adds a report.\n"
  },
  {
    "path": "doc/introduction.md",
    "content": "# nixpkgs-update {#introduction}\n\n> The future is here; let's evenly distribute it!\n\nThe [nixpkgs-update](https://github.com/nix-community/nixpkgs-update) mission\nis to make [nixpkgs](https://github.com/nixos/nixpkgs) the most\nup-to-date repository of software in the world by the most ridiculous\nmargin possible. [Here's how we are doing so far](https://repology.org/repositories/graphs).\n\nIt provides an interactive tool for automating single package\nupdates. Given a package name, old version, and new version, it\nupdates the version, and fetcher hashes, makes a commit, and\noptionally a pull request. Along the way, it does checks to make sure\nthe update has a baseline quality.\n\nIt is the code used by the GitHub bot\n[@r-ryantm](https://github.com/r-ryantm) to automatically update\nnixpkgs. It uses package repository information from\n[Repology.org](https://repology.org/repository/nix_unstable), the\nGitHub releases API, and the package passthru.updateScript to generate a lists of outdated\npackages.\n"
  },
  {
    "path": "doc/nixpkgs-maintainer-faq.md",
    "content": "# Nixpkgs Maintainer FAQ {#nixpkgs-maintainer-faq}\n\n## @r-ryantm opened a PR for my package, what do I do?\n\nThanks for being a maintainer. Hopefully, @r-ryantm will be able to save you some time!\n\n1. Review the PR diff, making sure this update makes sense\n   - sometimes updates go backward or accidentally use a dev version\n2. Review upstream changelogs and commits\n3. Follow the \"Instructions to test this update\" section of the PR to get the built program on your computer quickly\n4. Make a GitHub Review approving or requesting changes. Include screenshots or other notes as appropriate.\n\n## Why is @r-ryantm not updating my package? {#no-update}\n\nThere are lots of reasons a package might not be updated. You can usually figure out which one is the issue by looking at the [logs](https://nixpkgs-update-logs.nix-community.org/) or by asking the [maintainers](#contact).\n\n### No new version information\n\nr-ryantm gets its new version information from three sources:\n\n* Repology - information from Repology is delayed because it only updates when there is an unstable channel release\n* GitHub releases\n* package passthru.updateScript\n\nIf none of these sources says the package is out of date, it will not attempt to update it.\n\n### Disabling package updates\n\nUpdates can be disabled by adding a comment to the package:\n```\n# nixpkgs-update: no auto update\n```\n[Example in nixpkgs](https://github.com/NixOS/nixpkgs/blob/f2294037ad2b1345c5d9c2df0e81bdb00eab21f3/pkgs/applications/version-management/gitlab/gitlab-pages/default.nix#L7)\n\n### Skiplist\n\nWe maintain a [Skiplist](https://github.com/nix-community/nixpkgs-update/blob/main/src/Skiplist.hs) of different things not to update. 
It is possible your package is triggering one of the skip criteria.\n\nPython updates are skipped if they cause more than 100 rebuilds.\n\n### Existing Open or Draft PR\n\nIf there is an existing PR with the exact title of `$attrPath: $oldVersion -> $newVersion`, it will not update the package.\n\n### Version not newer\n\nIf Nix's `builtins.compareVersions` does not think the \"new\" version is newer, it will not update.\n\n### Incompatible with \"Path Pin\"\n\nSome attrpaths have versions appended to the end of them, like `ruby_3_0`; the new version has to be compatible with this \"Path Pin\". Here are some examples:\n\n```Haskell\n-- >>> versionCompatibleWithPathPin \"libgit2_0_25\" \"0.25.3\"\n-- True\n--\n-- >>> versionCompatibleWithPathPin \"owncloud90\" \"9.0.3\"\n-- True\n--\n-- >>> versionCompatibleWithPathPin \"owncloud-client\" \"2.4.1\"\n-- True\n--\n-- >>> versionCompatibleWithPathPin \"owncloud90\" \"9.1.3\"\n-- False\n--\n-- >>> versionCompatibleWithPathPin \"nodejs-slim-10_x\" \"11.2.0\"\n-- False\n--\n-- >>> versionCompatibleWithPathPin \"nodejs-slim-10_x\" \"10.12.0\"\n-- True\n```\n\n### Can't find derivation file\n\nIf `nix edit $attrpath` does not open the correct file that contains the version string and fetcher hash, the update will fail.\n\nThis can happen, for example, if a package doesn't have a `meta` attr at all, or if the package uses a builder function that discards the `meta` attr.\n\n### Update already merged\n\nIf the update is already on `master`, `staging`, or `staging-next`, the update will fail.\n\n### Can't find hash or source url\n\nIf the derivation file has no hash or source URL, it will fail.\n\nSince `nixpkgs-update` is trying to read these from `<pkg>.src`, this can also happen if the package's source is something unexpected such as another package. 
You can set the fallback `originalSrc` attr so that `nixpkgs-update` can find the correct source in cases like this.\n\n### No updateScript and no version\n\nIf the derivation file has no version and no updateScript, the update will fail.\n\n### No changes\n\nIf the derivation \"Rewriters\" fail to make any changes to the derivation, the update will fail.\n\nIf there is no updateScript and neither the source URL nor the hash changed, the update will fail.\n\n### No rebuilds\n\nIf the rewrites didn't cause any derivations to change, the update will fail.\n\n### Didn't build\n\nIf the package doesn't build after the rewrites, the update will fail.\n"
  },
  {
    "path": "doc/nixpkgs-update.md",
    "content": ""
  },
  {
    "path": "doc/nu.md",
    "content": "# nu\n\n## Schema\n\n## Table `package`\n\n- id : int\n- attrPath : string\n- versionNixpkgsMaster : string\n- versionNixpkgsStaging : string\n- versionNixpkgsStagingNext : string\n- versionRepology : string\n- versionGitHub : string\n- versionGitLab : string\n- versionPypi : string\n- projectRepology : string\n- nixpkgsNameReplogy : string\n- ownerGitHub : string\n- repoGitHub : string\n- ownerGitLab : string\n- repoGitLab : string\n- lastCheckedRepology : timestamp\n- lastCheckedGitHub : timestamp\n- lastCheckedGitLab : timestamp\n- lastCheckedPyPi : timestamp\n- lastCheckedPendingPR : timestamp\n- lastUpdateAttempt : timestamp\n- pendingPR : int\n- pendingPROwner : string\n- pendingPRBranchName : string\n- lastUpdateLog : string\n\n## Table `maintainer-package`\n\n- id : int\n- packageId : int\n- maintainerId : int\n\n## Table `maintainer`\n\n- id : int\n- gitHubName : string\n"
  },
  {
    "path": "doc/r-ryantm.md",
    "content": "# r-ryantm bot {#r-ryantm}\n\n[@r-ryantm](https://github.com/r-ryantm), is a bot account that updates Nixpkgs by making PRs that bump a package to the latest version. It runs on [community-configured infrastructure](https://nix-community.org/update-bot/).\n"
  },
  {
    "path": "doc/toc.md",
    "content": "# nixpkgs-update\n\n* [Introduction](#introduction)\n* [Installation](#installation)\n* [Interactive updates](#interactive-updates)\n* [Batch updates](#batch-updates)\n* [r-ryantm bot](#r-ryantm)\n* [Details](#details)\n* [Contributing](#contributing)\n* [Donate](#donate)\n* [Nixpkgs Maintainer FAQ](#nixpkgs-maintainer-faq)\n* [Contact](#contact)\n"
  },
  {
    "path": "flake.nix",
    "content": "{\n  description = \"update nixpkgs automatically\";\n\n  inputs.mmdoc.url = \"github:ryantm/mmdoc\";\n  inputs.mmdoc.inputs.nixpkgs.follows = \"nixpkgs\";\n\n  inputs.treefmt-nix.url = \"github:numtide/treefmt-nix\";\n  inputs.treefmt-nix.inputs.nixpkgs.follows = \"nixpkgs\";\n\n  inputs.runtimeDeps.url = \"github:NixOS/nixpkgs/nixos-unstable-small\";\n\n  nixConfig.extra-substituters = \"https://nix-community.cachix.org\";\n  nixConfig.extra-trusted-public-keys = \"nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=\";\n\n  outputs = { self, nixpkgs, mmdoc, treefmt-nix, runtimeDeps } @ args:\n    let\n      systems = [ \"x86_64-linux\" \"aarch64-linux\" \"x86_64-darwin\" \"aarch64-darwin\" ];\n      eachSystem = f: nixpkgs.lib.genAttrs systems (system: f nixpkgs.legacyPackages.${system});\n      treefmtEval = eachSystem (pkgs: treefmt-nix.lib.evalModule pkgs {\n        projectRootFile = \".git/config\";\n        programs.ormolu.enable = true;\n      });\n    in\n    {\n      checks.x86_64-linux =\n        let\n          packages = nixpkgs.lib.mapAttrs' (n: nixpkgs.lib.nameValuePair \"package-${n}\") self.packages.x86_64-linux;\n          devShells = nixpkgs.lib.mapAttrs' (n: nixpkgs.lib.nameValuePair \"devShell-${n}\") self.devShells.x86_64-linux;\n        in\n        packages // devShells // {\n          treefmt = treefmtEval.x86_64-linux.config.build.check self;\n        };\n\n      formatter = eachSystem (pkgs: treefmtEval.${pkgs.system}.config.build.wrapper);\n\n      packages.x86_64-linux = import ./pkgs/default.nix (args // { system = \"x86_64-linux\"; });\n      devShells.x86_64-linux.default = self.packages.\"x86_64-linux\".devShell;\n\n      # nix flake check is broken for these when run on x86_64-linux\n      # packages.x86_64-darwin = import ./pkgs/default.nix (args // { system = \"x86_64-darwin\"; });\n      # devShells.x86_64-darwin.default = self.packages.\"x86_64-darwin\".devShell;\n    };\n}\n"
  },
  {
    "path": "nixpkgs-update.cabal",
    "content": "cabal-version: 2.2\n\n-- This file has been generated from package.yaml by hpack version 0.38.2.\n--\n-- see: https://github.com/sol/hpack\n--\n-- hash: 216d241b554fc46ae5c7bcb1605ea89df7993f629b942a8f243338d52cb49301\n\nname:           nixpkgs-update\nversion:        0.4.0\nsynopsis:       Tool for semi-automatic updating of nixpkgs repository\ndescription:    nixpkgs-update provides tools for updating of nixpkgs packages in a semi-automatic way. Mainly, it is used to run the GitHub bot @r-ryantm, but the underlying update mechanisms should be generally useful and in a later version should be exposed as a command-line tool.\ncategory:       Web\nhomepage:       https://github.com/nix-community/nixpkgs-update#readme\nbug-reports:    https://github.com/nix-community/nixpkgs-update/issues\nauthor:         Ryan Mulligan et al.\nmaintainer:     ryan@ryantm.com\ncopyright:      2018-2022 Ryan Mulligan et al.\nlicense:        CC0-1.0\nlicense-file:   LICENSE\nbuild-type:     Simple\nextra-source-files:\n    README.md\n\nsource-repository head\n  type: git\n  location: https://github.com/nix-community/nixpkgs-update\n\nlibrary\n  exposed-modules:\n      Check\n      CVE\n      Data.Hex\n      DeleteMerged\n      File\n      GH\n      Git\n      Nix\n      NixpkgsReview\n      NVD\n      NVDRules\n      OurPrelude\n      Outpaths\n      Process\n      Repology\n      Rewrite\n      Skiplist\n      Update\n      Utils\n      Version\n  hs-source-dirs:\n      src\n  default-extensions:\n      DataKinds\n      FlexibleContexts\n      GADTs\n      LambdaCase\n      PolyKinds\n      RankNTypes\n      ScopedTypeVariables\n      TypeApplications\n      TypeFamilies\n      TypeOperators\n      BlockArguments\n  ghc-options: -Wall -O2 -flate-specialise -fspecialise-aggressively -fplugin=Polysemy.Plugin\n  build-depends:\n      aeson\n    , base >=4.13 && <5\n    , bytestring\n    , conduit\n    , containers\n    , cryptohash-sha256\n    , directory\n    , errors\n   
 , filepath\n    , github\n    , http-client\n    , http-client-tls\n    , http-conduit\n    , http-types\n    , iso8601-time\n    , lifted-base\n    , mtl\n    , neat-interpolation\n    , optparse-applicative\n    , parsec\n    , parsers\n    , partial-order\n    , polysemy\n    , polysemy-plugin\n    , regex-applicative-text\n    , servant\n    , servant-client\n    , sqlite-simple\n    , template-haskell\n    , temporary\n    , text\n    , th-env\n    , time\n    , transformers\n    , typed-process\n    , unix\n    , unordered-containers\n    , vector\n    , versions\n    , xdg-basedir\n    , zlib\n  default-language: Haskell2010\n\nexecutable nixpkgs-update\n  main-is: Main.hs\n  hs-source-dirs:\n      app\n  default-extensions:\n      DataKinds\n      FlexibleContexts\n      GADTs\n      LambdaCase\n      PolyKinds\n      RankNTypes\n      ScopedTypeVariables\n      TypeApplications\n      TypeFamilies\n      TypeOperators\n      BlockArguments\n  ghc-options: -Wall -O2 -flate-specialise -fspecialise-aggressively -fplugin=Polysemy.Plugin\n  build-depends:\n      aeson\n    , base >=4.13 && <5\n    , bytestring\n    , conduit\n    , containers\n    , cryptohash-sha256\n    , directory\n    , errors\n    , filepath\n    , github\n    , http-client\n    , http-client-tls\n    , http-conduit\n    , http-types\n    , iso8601-time\n    , lifted-base\n    , mtl\n    , neat-interpolation\n    , nixpkgs-update\n    , optparse-applicative\n    , parsec\n    , parsers\n    , partial-order\n    , polysemy\n    , polysemy-plugin\n    , regex-applicative-text\n    , servant\n    , servant-client\n    , sqlite-simple\n    , template-haskell\n    , temporary\n    , text\n    , th-env\n    , time\n    , transformers\n    , typed-process\n    , unix\n    , unordered-containers\n    , vector\n    , versions\n    , xdg-basedir\n    , zlib\n  default-language: Haskell2010\n\ntest-suite spec\n  type: exitcode-stdio-1.0\n  main-is: Spec.hs\n  other-modules:\n      CheckSpec\n      
DoctestSpec\n      UpdateSpec\n      UtilsSpec\n  hs-source-dirs:\n      test\n  default-extensions:\n      DataKinds\n      FlexibleContexts\n      GADTs\n      LambdaCase\n      PolyKinds\n      RankNTypes\n      ScopedTypeVariables\n      TypeApplications\n      TypeFamilies\n      TypeOperators\n      BlockArguments\n  ghc-options: -Wall -O2 -flate-specialise -fspecialise-aggressively -fplugin=Polysemy.Plugin\n  build-depends:\n      aeson\n    , base >=4.13 && <5\n    , bytestring\n    , conduit\n    , containers\n    , cryptohash-sha256\n    , directory\n    , doctest\n    , errors\n    , filepath\n    , github\n    , hspec\n    , hspec-discover\n    , http-client\n    , http-client-tls\n    , http-conduit\n    , http-types\n    , iso8601-time\n    , lifted-base\n    , mtl\n    , neat-interpolation\n    , nixpkgs-update\n    , optparse-applicative\n    , parsec\n    , parsers\n    , partial-order\n    , polysemy\n    , polysemy-plugin\n    , regex-applicative-text\n    , servant\n    , servant-client\n    , sqlite-simple\n    , template-haskell\n    , temporary\n    , text\n    , th-env\n    , time\n    , transformers\n    , typed-process\n    , unix\n    , unordered-containers\n    , vector\n    , versions\n    , xdg-basedir\n    , zlib\n  default-language: Haskell2010\n"
  },
  {
    "path": "nixpkgs-update.nix",
    "content": "{ mkDerivation, aeson, base, bytestring, conduit, containers\n, cryptohash-sha256, directory, doctest, errors, filepath, github\n, hspec, hspec-discover, http-client, http-client-tls, http-conduit\n, http-types, iso8601-time, lib, lifted-base, mtl\n, neat-interpolation, optparse-applicative, parsec, parsers\n, partial-order, polysemy, polysemy-plugin, regex-applicative-text\n, servant, servant-client, sqlite-simple, template-haskell\n, temporary, text, th-env, time, transformers, typed-process, unix\n, unordered-containers, vector, versions, xdg-basedir, zlib\n}:\nmkDerivation {\n  pname = \"nixpkgs-update\";\n  version = \"0.4.0\";\n  src = ./.;\n  isLibrary = true;\n  isExecutable = true;\n  libraryHaskellDepends = [\n    aeson base bytestring conduit containers cryptohash-sha256\n    directory errors filepath github http-client http-client-tls\n    http-conduit http-types iso8601-time lifted-base mtl\n    neat-interpolation optparse-applicative parsec parsers\n    partial-order polysemy polysemy-plugin regex-applicative-text\n    servant servant-client sqlite-simple template-haskell temporary\n    text th-env time transformers typed-process unix\n    unordered-containers vector versions xdg-basedir zlib\n  ];\n  executableHaskellDepends = [\n    aeson base bytestring conduit containers cryptohash-sha256\n    directory errors filepath github http-client http-client-tls\n    http-conduit http-types iso8601-time lifted-base mtl\n    neat-interpolation optparse-applicative parsec parsers\n    partial-order polysemy polysemy-plugin regex-applicative-text\n    servant servant-client sqlite-simple template-haskell temporary\n    text th-env time transformers typed-process unix\n    unordered-containers vector versions xdg-basedir zlib\n  ];\n  testHaskellDepends = [\n    aeson base bytestring conduit containers cryptohash-sha256\n    directory doctest errors filepath github hspec hspec-discover\n    http-client http-client-tls http-conduit http-types 
iso8601-time\n    lifted-base mtl neat-interpolation optparse-applicative parsec\n    parsers partial-order polysemy polysemy-plugin\n    regex-applicative-text servant servant-client sqlite-simple\n    template-haskell temporary text th-env time transformers\n    typed-process unix unordered-containers vector versions xdg-basedir\n    zlib\n  ];\n  testToolDepends = [ hspec-discover ];\n  homepage = \"https://github.com/nix-community/nixpkgs-update#readme\";\n  description = \"Tool for semi-automatic updating of nixpkgs repository\";\n  license = lib.licenses.cc0;\n  mainProgram = \"nixpkgs-update\";\n}\n"
  },
  {
    "path": "package.yaml",
    "content": "name: nixpkgs-update\nversion: 0.4.0\nsynopsis: Tool for semi-automatic updating of nixpkgs repository\ndescription: nixpkgs-update provides tools for updating of nixpkgs\n  packages in a semi-automatic way. Mainly, it is used to run the GitHub\n  bot @r-ryantm, but the underlying update mechanisms should be\n  generally useful and in a later version should be exposed as a\n  command-line tool.\nlicense: CC0-1.0\nauthor: Ryan Mulligan et al.\nmaintainer: ryan@ryantm.com\ncopyright: 2018-2022 Ryan Mulligan et al.\ncategory: Web\nextra-source-files:\n- README.md\n\ngithub: nix-community/nixpkgs-update\n\nghc-options: -Wall -O2 -flate-specialise -fspecialise-aggressively -fplugin=Polysemy.Plugin\n\ndefault-extensions:\n  - DataKinds\n  - FlexibleContexts\n  - GADTs\n  - LambdaCase\n  - PolyKinds\n  - RankNTypes\n  - ScopedTypeVariables\n  - TypeApplications\n  - TypeFamilies\n  - TypeOperators\n  - BlockArguments\n\ndependencies:\n  - aeson\n  - base >= 4.13 && < 5\n  - bytestring\n  - conduit\n  - containers\n  - cryptohash-sha256\n  - directory\n  - errors\n  - filepath\n  - github\n  - http-client\n  - http-client-tls\n  - http-conduit\n  - http-types\n  - iso8601-time\n  - lifted-base\n  - mtl\n  - neat-interpolation\n  - optparse-applicative\n  - parsec\n  - parsers\n  - partial-order\n  - polysemy\n  - polysemy-plugin\n  - regex-applicative-text\n  - servant\n  - servant-client\n  - sqlite-simple\n  - template-haskell\n  - temporary\n  - text\n  - th-env\n  - time\n  - transformers\n  - typed-process\n  - unix\n  - unordered-containers\n  - vector\n  - versions\n  - xdg-basedir\n  - zlib\n\nlibrary:\n  source-dirs: src\n  when:\n    - condition: false\n      other-modules: Paths_nixpkgs_update\n\ntests:\n  spec:\n    main: Spec.hs\n    source-dirs:\n      - test\n    dependencies:\n      - hspec\n      - hspec-discover\n      - nixpkgs-update\n      - doctest\n    when:\n      - condition: false\n        other-modules: 
Paths_nixpkgs_update\n\nexecutables:\n  nixpkgs-update:\n    source-dirs: app\n    main: Main.hs\n    dependencies:\n      - nixpkgs-update\n    when:\n      - condition: false\n        other-modules: Paths_nixpkgs_update\n"
  },
  {
    "path": "pkgs/default.nix",
    "content": "{ nixpkgs\n, mmdoc\n, runtimeDeps\n, system\n, self\n, ...\n}:\n\nlet\n\n  runtimePkgs = import runtimeDeps { inherit system; };\n\n  pkgs = import nixpkgs { inherit system; config = { allowBroken = true; }; };\n\n  deps = with runtimePkgs; {\n    NIX = nix;\n    GIT = git;\n    TREE = tree;\n    GIST = gist;\n    # TODO: are there more coreutils paths that need locking down?\n    TIMEOUT = coreutils;\n    NIXPKGSREVIEW = nixpkgs-review;\n  };\n\n  drvAttrs = attrs: deps;\n\n  haskellPackages = pkgs.haskellPackages.override {\n    overrides = _: haskellPackages: {\n      polysemy-plugin = pkgs.haskell.lib.dontCheck haskellPackages.polysemy-plugin;\n      polysemy = pkgs.haskell.lib.dontCheck haskellPackages.polysemy;\n      http-api-data = pkgs.haskell.lib.doJailbreak haskellPackages.http-api-data;\n      nixpkgs-update =\n        pkgs.haskell.lib.justStaticExecutables (\n          pkgs.haskell.lib.failOnAllWarnings (\n            pkgs.haskell.lib.disableExecutableProfiling (\n              pkgs.haskell.lib.disableLibraryProfiling (\n                pkgs.haskell.lib.generateOptparseApplicativeCompletion \"nixpkgs-update\" (\n                  (haskellPackages.callPackage ../nixpkgs-update.nix { }).overrideAttrs drvAttrs\n                )\n              )\n            )\n          )\n        );\n    };\n  };\n\n  shell = haskellPackages.shellFor {\n    nativeBuildInputs = with pkgs; [\n      cabal-install\n      ghcid\n      haskellPackages.cabal2nix\n    ];\n    packages = ps: [ ps.nixpkgs-update ];\n    shellHook = ''\n      export ${\n        nixpkgs.lib.concatStringsSep \" \" (\n          nixpkgs.lib.mapAttrsToList (name: value: ''${name}=\"${value}\"'') deps\n        )\n      }\n    '';\n  };\n\n  doc = pkgs.stdenvNoCC.mkDerivation rec {\n    name = \"nixpkgs-update-doc\";\n    src = self;\n    phases = [ \"mmdocPhase\" ];\n    mmdocPhase = \"${mmdoc.packages.${system}.mmdoc}/bin/mmdoc nixpkgs-update $src/doc $out\";\n  };\n\nin\n{\n  
nixpkgs-update = haskellPackages.nixpkgs-update;\n  default = haskellPackages.nixpkgs-update;\n  nixpkgs-update-doc = doc;\n  devShell = shell;\n}\n"
  },
  {
    "path": "rust/.envrc",
    "content": "use flake\ndotenv"
  },
  {
    "path": "rust/.gitignore",
    "content": "target\ndb.sqlite\n.direnv"
  },
  {
    "path": "rust/Cargo.toml",
    "content": "[package]\nname = \"nixpkgs-update\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n[dependencies]\nchrono = \"0.4.26\"\ndiesel = { version = \"2.1.0\", features = [\"sqlite\", \"chrono\"] }\njson = \"0.12.4\"\nureq = \"2.7.1\"\n"
  },
  {
    "path": "rust/diesel.toml",
    "content": "# For documentation on how to configure this file,\n# see https://diesel.rs/guides/configuring-diesel-cli\n\n[print_schema]\nfile = \"src/schema.rs\"\ncustom_type_derives = [\"diesel::query_builder::QueryId\"]\n\n[migrations_directory]\ndir = \"migrations\"\n"
  },
  {
    "path": "rust/flake.nix",
    "content": "{\n  description = \"update nixpkgs automatically\";\n\n  outputs = { self, nixpkgs } @ args: let\n    pkgs = nixpkgs.legacyPackages.x86_64-linux;\n  in {\n    devShells.x86_64-linux.default = pkgs.mkShell {\n      packages = [\n        pkgs.cargo\n        pkgs.clippy\n        pkgs.sqlite\n        pkgs.diesel-cli\n        pkgs.openssl\n        pkgs.pkg-config\n        pkgs.rustfmt\n      ];\n    };\n  };\n}\n"
  },
  {
    "path": "rust/migrations/2023-08-12-152848_create_packages/down.sql",
    "content": "DROP table packages;\n"
  },
  {
    "path": "rust/migrations/2023-08-12-152848_create_packages/up.sql",
    "content": "CREATE TABLE packages (\n    id TEXT PRIMARY KEY NOT NULL\n  , attr_path TEXT NOT NULL\n  , last_update_attempt DATETIME\n  , last_update_log DATETIME\n  , version_nixpkgs_master TEXT\n  , last_checked_nixpkgs_master DATETIME\n  , version_nixpkgs_staging TEXT\n  , last_checked_nixpkgs_staging DATETIME\n  , version_nixpkgs_staging_next TEXT\n  , last_checked_nixpkgs_staging_next DATETIME\n  , version_repology TEXT\n  , project_repology TEXT\n  , nixpkgs_name_replogy TEXT\n  , last_checked_repology DATETIME\n  , version_github TEXT\n  , owner_github TEXT\n  , repo_github TEXT\n  , last_checked_github DATETIME\n  , version_gitlab TEXT\n  , owner_gitlab TEXT\n  , repo_gitlab TEXT\n  , last_checked_gitlab DATETIME\n  , package_name_pypi TEXT\n  , version_pypi TEXT\n  , last_checked_pypi DATETIME\n  , pending_pr INTEGER\n  , pending_pr_owner TEXT\n  , pending_pr_branch_name TEXT\n  , last_checked_pending_pr DATETIME\n)\n"
  },
  {
    "path": "rust/src/github.rs",
    "content": "use crate::nix;\n\nfn token() -> Option<String> {\n    if let Ok(token) = std::env::var(\"GH_TOKEN\") {\n        return Some(token);\n    }\n    if let Ok(token) = std::env::var(\"GITHUB_TOKEN\") {\n        return Some(token);\n    }\n    None\n}\n\npub fn latest_release(github: &Github) -> Result<json::JsonValue, &'static str> {\n    let mut request = ureq::get(&format!(\n        \"https://api.github.com/repos/{}/{}/releases/latest\",\n        github.owner, github.repo,\n    ))\n    .set(\"Accept\", \"application/vnd.github+json\")\n    .set(\"X-GitHub-Api-Version\", \"2022-11-28\");\n\n    if let Some(token) = token() {\n        request = request.set(\"Authorization\", &format!(\"Bearer {}\", token));\n    }\n\n    let body = request.call().unwrap().into_string().unwrap();\n    if let json::JsonValue::Object(response) = json::parse(&body).unwrap() {\n        return Ok(response[\"tag_name\"].clone());\n    }\n    Err(\"Couldn't find\")\n}\n\npub struct Github {\n    pub owner: String,\n    pub repo: String,\n}\n\npub fn from(attr_path: &String) -> Option<Github> {\n    let url = nix::eval(\"master\", attr_path, \"(drv: drv.src.url)\").unwrap();\n    if !url.contains(\"github\") {\n        return None;\n    }\n    let owner = nix::eval(\"master\", attr_path, \"(drv: drv.src.owner)\").unwrap();\n    let repo = nix::eval(\"master\", attr_path, \"(drv: drv.src.repo)\").unwrap();\n    Some(Github {\n        owner: owner.to_string(),\n        repo: repo.to_string(),\n    })\n}\n"
  },
  {
    "path": "rust/src/lib.rs",
    "content": "pub mod models;\npub mod schema;\n\nuse diesel::prelude::*;\nuse std::env;\n\npub fn establish_connection() -> SqliteConnection {\n    let database_url = env::var(\"DATABASE_URL\").expect(\"DATABASE_URL must be set\");\n    SqliteConnection::establish(&database_url)\n        .unwrap_or_else(|_| panic!(\"Error connecting to {}\", database_url))\n}\n"
  },
  {
    "path": "rust/src/main.rs",
    "content": "mod github;\nmod nix;\nmod repology;\n\nuse chrono::offset::Utc;\nuse diesel::prelude::*;\nuse nixpkgs_update::models::*;\nuse nixpkgs_update::*;\n\nfn version_in_nixpkgs_branch(branch: &str, attr_path: &String) -> Option<String> {\n    nix::eval(branch, attr_path, \"(drv: drv.version)\")\n}\n\nfn version_in_nixpkgs_master(attr_path: &String) -> Option<String> {\n    version_in_nixpkgs_branch(\"master\", attr_path)\n}\n\nfn version_in_nixpkgs_staging(attr_path: &String) -> Option<String> {\n    version_in_nixpkgs_branch(\"staging\", attr_path)\n}\n\nfn version_in_nixpkgs_staging_next(attr_path: &String) -> Option<String> {\n    version_in_nixpkgs_branch(\"staging-next\", attr_path)\n}\n\nfn main() {\n    use nixpkgs_update::schema::packages::dsl::*;\n\n    let connection = &mut establish_connection();\n    let results: Vec<Package> = packages.load(connection).expect(\"Error loading packages\");\n\n    println!(\"Displaying {} packages\", results.len());\n    for package in results {\n        println!(\"{} {}\", package.id, package.attr_path);\n\n        let result: String = repology::latest_version(&package.attr_path)\n            .unwrap()\n            .to_string();\n        println!(\"newest repology version {}\", result);\n\n        let version_in_nixpkgs_master: String =\n            version_in_nixpkgs_master(&package.attr_path).unwrap();\n        println!(\"nixpkgs master version {}\", version_in_nixpkgs_master);\n\n        let version_in_nixpkgs_staging: String =\n            version_in_nixpkgs_staging(&package.attr_path).unwrap();\n        println!(\"nixpkgs staging version {}\", version_in_nixpkgs_staging);\n\n        let version_in_nixpkgs_staging_next: String =\n            version_in_nixpkgs_staging_next(&package.attr_path).unwrap();\n        println!(\n            \"nixpkgs staging_next version {}\",\n            version_in_nixpkgs_staging_next\n        );\n\n        let now = Some(Utc::now().naive_utc());\n        
diesel::update(packages.find(&package.id))\n            .set((\n                version_nixpkgs_master.eq(Some(version_in_nixpkgs_master)),\n                last_checked_nixpkgs_master.eq(now),\n                version_nixpkgs_staging.eq(Some(version_in_nixpkgs_staging)),\n                last_checked_nixpkgs_staging.eq(now),\n                version_nixpkgs_staging_next.eq(Some(version_in_nixpkgs_staging_next)),\n                last_checked_nixpkgs_staging_next.eq(now),\n                version_repology.eq(Some(result)),\n                last_checked_repology.eq(now),\n            ))\n            .execute(connection)\n            .unwrap();\n\n        if let Some(github) = github::from(&package.attr_path) {\n            println!(\"found github for {}\", package.attr_path);\n\n            let vgithub: String = github::latest_release(&github).unwrap().to_string();\n            println!(\"version github {}\", vgithub);\n            let now = Some(Utc::now().naive_utc());\n            diesel::update(packages.find(&package.id))\n                .set((\n                    version_github.eq(Some(vgithub)),\n                    owner_github.eq(Some(github.owner)),\n                    repo_github.eq(Some(github.repo)),\n                    last_checked_github.eq(now),\n                ))\n                .execute(connection)\n                .unwrap();\n        }\n    }\n}\n"
  },
  {
    "path": "rust/src/models.rs",
    "content": "use chrono::NaiveDateTime;\nuse diesel::prelude::*;\n\n#[derive(Queryable, Selectable)]\n#[diesel(table_name = crate::schema::packages)]\n#[diesel(check_for_backend(diesel::sqlite::Sqlite))]\npub struct Package {\n    pub id: String,\n    pub attr_path: String,\n    pub last_update_attempt: Option<NaiveDateTime>,\n    pub last_update_log: Option<NaiveDateTime>,\n    pub version_nixpkgs_master: Option<String>,\n    pub last_checked_nixpkgs_master: Option<NaiveDateTime>,\n    pub version_nixpkgs_staging: Option<String>,\n    pub last_checked_nixpkgs_staging: Option<NaiveDateTime>,\n    pub version_nixpkgs_staging_next: Option<String>,\n    pub last_checked_nixpkgs_staging_next: Option<NaiveDateTime>,\n    pub project_repology: Option<String>,\n    pub nixpkgs_name_replogy: Option<String>,\n    pub version_repology: Option<String>,\n    pub last_checked_repology: Option<NaiveDateTime>,\n    pub owner_github: Option<String>,\n    pub repo_github: Option<String>,\n    pub version_github: Option<String>,\n    pub last_checked_github: Option<NaiveDateTime>,\n    pub owner_gitlab: Option<String>,\n    pub repo_gitlab: Option<String>,\n    pub version_gitlab: Option<String>,\n    pub last_checked_gitlab: Option<NaiveDateTime>,\n    pub package_name_pypi: Option<String>,\n    pub version_pypi: Option<String>,\n    pub last_checked_pypi: Option<NaiveDateTime>,\n    pub pending_pr: Option<i32>,\n    pub pending_pr_owner: Option<String>,\n    pub pending_pr_branch_name: Option<String>,\n    pub last_checked_pending_pr: Option<NaiveDateTime>,\n}\n"
  },
  {
    "path": "rust/src/nix.rs",
    "content": "use std::process::Command;\n\npub fn eval(branch: &str, attr_path: &String, apply: &str) -> Option<String> {\n    let output = Command::new(\"nix\")\n        .arg(\"eval\")\n        .arg(\"--raw\")\n        .arg(\"--refresh\")\n        .arg(&format!(\"github:nixos/nixpkgs/{}#{}\", branch, attr_path))\n        .arg(\"--apply\")\n        .arg(apply)\n        .output();\n    match output {\n        Ok(output) => Some(String::from_utf8_lossy(&output.stdout).to_string()),\n        Err(_) => None,\n    }\n}\n"
  },
  {
    "path": "rust/src/repology.rs",
    "content": "pub fn latest_version(project_name: &String) -> Result<json::JsonValue, &'static str> {\n    let body = ureq::get(&format!(\n        \"https://repology.org/api/v1/project/{}\",\n        project_name\n    ))\n    .set(\"User-Agent\", \"nixpkgs-update\")\n    .call()\n    .unwrap()\n    .into_string()\n    .unwrap();\n    let json = json::parse(&body).unwrap();\n    if let json::JsonValue::Array(projects) = json {\n        for project in projects {\n            if let json::JsonValue::Object(project_repo) = project {\n                if project_repo[\"status\"] == \"newest\" {\n                    return Ok(project_repo.get(\"version\").unwrap().clone());\n                }\n            }\n        }\n    }\n    Err(\"Couldn't find\")\n}\n"
  },
  {
    "path": "rust/src/schema.rs",
    "content": "// @generated automatically by Diesel CLI.\n\ndiesel::table! {\n    packages (id) {\n        id -> Text,\n        attr_path -> Text,\n        last_update_attempt -> Nullable<Timestamp>,\n        last_update_log -> Nullable<Timestamp>,\n        version_nixpkgs_master -> Nullable<Text>,\n        last_checked_nixpkgs_master -> Nullable<Timestamp>,\n        version_nixpkgs_staging -> Nullable<Text>,\n        last_checked_nixpkgs_staging -> Nullable<Timestamp>,\n        version_nixpkgs_staging_next -> Nullable<Text>,\n        last_checked_nixpkgs_staging_next -> Nullable<Timestamp>,\n        version_repology -> Nullable<Text>,\n        project_repology -> Nullable<Text>,\n        nixpkgs_name_replogy -> Nullable<Text>,\n        last_checked_repology -> Nullable<Timestamp>,\n        version_github -> Nullable<Text>,\n        owner_github -> Nullable<Text>,\n        repo_github -> Nullable<Text>,\n        last_checked_github -> Nullable<Timestamp>,\n        version_gitlab -> Nullable<Text>,\n        owner_gitlab -> Nullable<Text>,\n        repo_gitlab -> Nullable<Text>,\n        last_checked_gitlab -> Nullable<Timestamp>,\n        package_name_pypi -> Nullable<Text>,\n        version_pypi -> Nullable<Text>,\n        last_checked_pypi -> Nullable<Timestamp>,\n        pending_pr -> Nullable<Integer>,\n        pending_pr_owner -> Nullable<Text>,\n        pending_pr_branch_name -> Nullable<Text>,\n        last_checked_pending_pr -> Nullable<Timestamp>,\n    }\n}\n"
  },
  {
    "path": "src/CVE.hs",
    "content": "{-# LANGUAGE NamedFieldPuns #-}\n{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE RecordWildCards #-}\n\nmodule CVE\n  ( parseFeed,\n    CPE (..),\n    CPEMatch (..),\n    CPEMatchRow (..),\n    cpeMatches,\n    CVE (..),\n    CVEID,\n    cveLI,\n  )\nwhere\n\nimport Data.Aeson\n  ( FromJSON,\n    Key,\n    Object,\n    eitherDecode,\n    parseJSON,\n    withObject,\n    (.!=),\n    (.:),\n    (.:!),\n  )\nimport Data.Aeson.Types (Parser, prependFailure)\nimport qualified Data.ByteString.Lazy.Char8 as BSL\nimport Data.List (intercalate)\nimport qualified Data.Text as T\nimport Data.Time.Clock (UTCTime)\nimport Database.SQLite.Simple (FromRow, ToRow, field, fromRow, toRow)\nimport Database.SQLite.Simple.ToField (toField)\nimport OurPrelude\nimport Utils (Boundary (..), VersionMatcher (..))\n\ntype CVEID = Text\n\ndata CVE = CVE\n  { cveID :: CVEID,\n    cveCPEMatches :: [CPEMatch],\n    cveDescription :: Text,\n    cvePublished :: UTCTime,\n    cveLastModified :: UTCTime\n  }\n  deriving (Show, Eq, Ord)\n\n-- | cve list item\ncveLI :: CVE -> Bool -> Text\ncveLI c patched =\n  \"- [\"\n    <> cveID c\n    <> \"](https://nvd.nist.gov/vuln/detail/\"\n    <> cveID c\n    <> \")\"\n    <> p\n  where\n    p =\n      if patched\n        then \" (patched)\"\n        else \"\"\n\ndata CPEMatch = CPEMatch\n  { cpeMatchCPE :: CPE,\n    cpeMatchVulnerable :: Bool,\n    cpeMatchVersionMatcher :: VersionMatcher\n  }\n  deriving (Show, Eq, Ord)\n\ninstance FromRow CPEMatch where\n  fromRow = do\n    cpeMatchCPE <- fromRow\n    let cpeMatchVulnerable = True\n    cpeMatchVersionMatcher <- field\n    pure CPEMatch {..}\n\n-- This decodes an entire CPE string and related attributes, but we only use\n-- cpeVulnerable, cpeProduct, cpeVersion and cpeMatcher.\ndata CPE = CPE\n  { cpePart :: (Maybe Text),\n    cpeVendor :: (Maybe Text),\n    cpeProduct :: (Maybe Text),\n    cpeVersion :: (Maybe Text),\n    cpeUpdate :: (Maybe Text),\n    cpeEdition :: (Maybe Text),\n    
cpeLanguage :: (Maybe Text),\n    cpeSoftwareEdition :: (Maybe Text),\n    cpeTargetSoftware :: (Maybe Text),\n    cpeTargetHardware :: (Maybe Text),\n    cpeOther :: (Maybe Text)\n  }\n  deriving (Eq, Ord)\n\ninstance Show CPE where\n  show\n    CPE\n      { cpePart,\n        cpeVendor,\n        cpeProduct,\n        cpeVersion,\n        cpeUpdate,\n        cpeEdition,\n        cpeLanguage,\n        cpeSoftwareEdition,\n        cpeTargetSoftware,\n        cpeTargetHardware,\n        cpeOther\n      } =\n      \"CPE {\"\n        <> (intercalate \", \" . concat)\n          [ cpeField \"part\" cpePart,\n            cpeField \"vendor\" cpeVendor,\n            cpeField \"product\" cpeProduct,\n            cpeField \"version\" cpeVersion,\n            cpeField \"update\" cpeUpdate,\n            cpeField \"edition\" cpeEdition,\n            cpeField \"language\" cpeLanguage,\n            cpeField \"softwareEdition\" cpeSoftwareEdition,\n            cpeField \"targetSoftware\" cpeTargetSoftware,\n            cpeField \"targetHardware\" cpeTargetHardware,\n            cpeField \"other\" cpeOther\n          ]\n        <> \"}\"\n      where\n        cpeField :: Show a => String -> Maybe a -> [String]\n        cpeField _ Nothing = []\n        cpeField name (Just value) = [name <> \" = \" <> show value]\n\ninstance ToRow CPE where\n  toRow\n    CPE\n      { cpePart,\n        cpeVendor,\n        cpeProduct,\n        cpeVersion,\n        cpeUpdate,\n        cpeEdition,\n        cpeLanguage,\n        cpeSoftwareEdition,\n        cpeTargetSoftware,\n        cpeTargetHardware,\n        cpeOther\n      } =\n      fmap -- There is no toRow instance for a tuple this large\n        toField\n        [ cpePart,\n          cpeVendor,\n          cpeProduct,\n          cpeVersion,\n          cpeUpdate,\n          cpeEdition,\n          cpeLanguage,\n          cpeSoftwareEdition,\n          cpeTargetSoftware,\n          cpeTargetHardware,\n          cpeOther\n        ]\n\ninstance FromRow CPE 
where\n  fromRow = do\n    cpePart <- field\n    cpeVendor <- field\n    cpeProduct <- field\n    cpeVersion <- field\n    cpeUpdate <- field\n    cpeEdition <- field\n    cpeLanguage <- field\n    cpeSoftwareEdition <- field\n    cpeTargetSoftware <- field\n    cpeTargetHardware <- field\n    cpeOther <- field\n    pure CPE {..}\n\n-- | Parse a @description_data@ subtree and return the concatenation of the\n-- english descriptions.\nparseDescription :: Object -> Parser Text\nparseDescription o = do\n  dData <- o .: \"description_data\"\n  descriptions <-\n    fmap concat $\n      sequence $\n        flip map dData $\n          \\dDatum -> do\n            value <- dDatum .: \"value\"\n            lang :: Text <- dDatum .: \"lang\"\n            pure $\n              case lang of\n                \"en\" -> [value]\n                _ -> []\n  pure $ T.intercalate \"\\n\\n\" descriptions\n\ninstance FromJSON CVE where\n  parseJSON =\n    withObject \"CVE\" $ \\o -> do\n      cve <- o .: \"cve\"\n      meta <- cve .: \"CVE_data_meta\"\n      cveID <- meta .: \"ID\"\n      prependFailure (T.unpack cveID <> \": \") $ do\n        cfgs <- o .: \"configurations\"\n        cveCPEMatches <- parseConfigurations cfgs\n        cvePublished <- o .: \"publishedDate\"\n        cveLastModified <- o .: \"lastModifiedDate\"\n        description <- cve .: \"description\"\n        cveDescription <- parseDescription description\n        pure CVE {..}\n\ninstance ToRow CVE where\n  toRow CVE {cveID, cveDescription, cvePublished, cveLastModified} =\n    toRow (cveID, cveDescription, cvePublished, cveLastModified)\n\ninstance FromRow CVE where\n  fromRow = do\n    let cveCPEMatches = []\n    cveID <- field\n    cveDescription <- field\n    cvePublished <- field\n    cveLastModified <- field\n    pure CVE {..}\n\nsplitCPE :: Text -> [Maybe Text]\nsplitCPE =\n  map (toMaybe . T.replace \"\\a\" \":\") . T.splitOn \":\" . 
T.replace \"\\\\:\" \"\\a\"\n  where\n    toMaybe \"*\" = Nothing\n    toMaybe x = Just x\n\ninstance FromJSON CPEMatch where\n  parseJSON =\n    withObject \"CPEMatch\" $ \\o -> do\n      t <- o .: \"cpe23Uri\"\n      cpeMatchCPE <-\n        case splitCPE t of\n          [Just \"cpe\", Just \"2.3\", cpePart, cpeVendor, cpeProduct, cpeVersion, cpeUpdate, cpeEdition, cpeLanguage, cpeSoftwareEdition, cpeTargetSoftware, cpeTargetHardware, cpeOther] ->\n            pure CPE {..}\n          _ -> fail $ \"unparsable cpe23Uri: \" <> T.unpack t\n      cpeMatchVulnerable <- o .: \"vulnerable\"\n      vStartIncluding <- o .:! \"versionStartIncluding\"\n      vEndIncluding <- o .:! \"versionEndIncluding\"\n      vStartExcluding <- o .:! \"versionStartExcluding\"\n      vEndExcluding <- o .:! \"versionEndExcluding\"\n      startBoundary <-\n        case (vStartIncluding, vStartExcluding) of\n          (Nothing, Nothing) -> pure Unbounded\n          (Just start, Nothing) -> pure (Including start)\n          (Nothing, Just start) -> pure (Excluding start)\n          (Just _, Just _) -> fail \"multiple version starts\"\n      endBoundary <-\n        case (vEndIncluding, vEndExcluding) of\n          (Nothing, Nothing) -> pure Unbounded\n          (Just end, Nothing) -> pure (Including end)\n          (Nothing, Just end) -> pure (Excluding end)\n          (Just _, Just _) -> fail \"multiple version ends\"\n      cpeMatchVersionMatcher <-\n        case (cpeVersion cpeMatchCPE, startBoundary, endBoundary) of\n          (Just v, Unbounded, Unbounded) -> pure $ SingleMatcher v\n          (Nothing, start, end) -> pure $ RangeMatcher start end\n          _ ->\n            fail\n              ( \"cpe_match has both version \"\n                  <> show (cpeVersion cpeMatchCPE)\n                  <> \" in cpe, and boundaries from \"\n                  <> show startBoundary\n                  <> \" to \"\n                  <> show endBoundary\n              )\n      pure (CPEMatch 
{..})\n\ndata CPEMatchRow\n  = CPEMatchRow CVE CPEMatch\n\ninstance ToRow CPEMatchRow where\n  toRow (CPEMatchRow CVE {cveID} CPEMatch {cpeMatchCPE, cpeMatchVersionMatcher}) =\n    [toField $ Just cveID]\n      ++ toRow cpeMatchCPE\n      ++ [toField cpeMatchVersionMatcher]\n\ninstance FromRow CPEMatchRow where\n  fromRow = do\n    let cveCPEMatches = []\n    let cveDescription = undefined\n    let cvePublished = undefined\n    let cveLastModified = undefined\n    cveID <- field\n    cpeM <- fromRow\n    pure $ CPEMatchRow (CVE {..}) cpeM\n\ncpeMatches :: [CVE] -> [CPEMatchRow]\ncpeMatches = concatMap rows\n  where\n    rows cve = fmap (CPEMatchRow cve) (cveCPEMatches cve)\n\nguardAttr :: (Eq a, FromJSON a, Show a) => Object -> Key -> a -> Parser ()\nguardAttr object attribute expected = do\n  actual <- object .: attribute\n  unless (actual == expected) $\n    fail $\n      \"unexpected \"\n        <> show attribute\n        <> \", expected \"\n        <> show expected\n        <> \", got \"\n        <> show actual\n\nboundedMatcher :: VersionMatcher -> Bool\nboundedMatcher (RangeMatcher Unbounded Unbounded) = False\nboundedMatcher _ = True\n\n-- Because complex boolean formulas can't be used to determine if a single\n-- product/version is vulnerable, we simply use all leaves marked vulnerable.\nparseNode :: Object -> Parser [CPEMatch]\nparseNode node = do\n  maybeChildren <- node .:! \"children\"\n  case maybeChildren of\n    Nothing -> do\n      matches <- node .:! 
\"cpe_match\" .!= []\n      pure $\n        filter (cpeMatchVersionMatcher >>> boundedMatcher) $\n          filter cpeMatchVulnerable matches\n    Just children -> do\n      fmap concat $ sequence $ map parseNode children\n\nparseConfigurations :: Object -> Parser [CPEMatch]\nparseConfigurations o = do\n  guardAttr o \"CVE_data_version\" (\"4.0\" :: Text)\n  nodes <- o .: \"nodes\"\n  fmap concat $ sequence $ map parseNode nodes\n\nparseFeed :: BSL.ByteString -> Either Text [CVE]\nparseFeed = bimap T.pack cvefItems . eitherDecode\n\ndata CVEFeed = CVEFeed\n  { cvefItems :: [CVE]\n  }\n\ninstance FromJSON CVEFeed where\n  parseJSON = withObject \"CVEFeed\" $ \\o -> CVEFeed <$> o .: \"CVE_Items\"\n"
  },
  {
    "path": "src/Check.hs",
    "content": "{-# LANGUAGE ExtendedDefaultRules #-}\n{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE QuasiQuotes #-}\n{-# LANGUAGE TemplateHaskell #-}\n{-# OPTIONS_GHC -fno-warn-type-defaults #-}\n\nmodule Check\n  ( result,\n    -- exposed for testing:\n    hasVersion,\n    versionWithoutPath,\n  )\nwhere\n\nimport Control.Applicative (many)\nimport Data.Char (isDigit, isLetter)\nimport Data.Maybe (fromJust)\nimport qualified Data.Text as T\nimport qualified Data.Text.IO as T\nimport Language.Haskell.TH.Env (envQ)\nimport OurPrelude\nimport System.Exit ()\nimport Text.Regex.Applicative.Text (RE', (=~))\nimport qualified Text.Regex.Applicative.Text as RE\nimport Utils (UpdateEnv (..), nixBuildOptions)\n\ndefault (T.Text)\n\ntreeBin :: String\ntreeBin = fromJust ($$(envQ \"TREE\") :: Maybe String) <> \"/bin/tree\"\n\nprocTree :: [String] -> ProcessConfig () () ()\nprocTree = proc treeBin\n\ngistBin :: String\ngistBin = fromJust ($$(envQ \"GIST\") :: Maybe String) <> \"/bin/gist\"\n\ndata BinaryCheck = BinaryCheck\n  { filePath :: FilePath,\n    zeroExitCode :: Bool,\n    versionPresent :: Bool\n  }\n\nisWordCharacter :: Char -> Bool\nisWordCharacter c = (isDigit c) || (isLetter c)\n\nisNonWordCharacter :: Char -> Bool\nisNonWordCharacter c = not (isWordCharacter c)\n\n-- | Construct regex: /.*\\b${version}\\b.*/s\nversionRegex :: Text -> RE' ()\nversionRegex version =\n  (\\_ -> ())\n    <$> ( (((many RE.anySym) <* (RE.psym isNonWordCharacter)) <|> (RE.pure \"\"))\n            *> (RE.string version)\n            <* ((RE.pure \"\") <|> ((RE.psym isNonWordCharacter) *> (many RE.anySym)))\n        )\n\nhasVersion :: Text -> Text -> Bool\nhasVersion contents expectedVersion =\n  isJust $ contents =~ versionRegex expectedVersion\n\ncheckTestsBuild :: Text -> IO Bool\ncheckTestsBuild attrPath = do\n  let timeout = \"30m\"\n  let args =\n        [T.unpack timeout, \"nix-build\"]\n          ++ nixBuildOptions\n          ++ [ \"-E\",\n               \"{ config }: (import 
./. { inherit config; }).\"\n                 ++ (T.unpack attrPath)\n                 ++ \".tests or {}\"\n             ]\n  r <- runExceptT $ ourReadProcessInterleaved $ proc \"timeout\" args\n  case r of\n    Left errorMessage -> do\n      T.putStrLn $ attrPath <> \".tests process failed with output: \" <> errorMessage\n      return False\n    Right (exitCode, output) -> do\n      case exitCode of\n        ExitFailure 124 -> do\n          T.putStrLn $ attrPath <> \".tests took longer than \" <> timeout <> \" and timed out. Other output: \" <> output\n          return False\n        ExitSuccess -> return True\n        _ -> return False\n\ncheckTestsBuildReport :: Bool -> Text\ncheckTestsBuildReport False =\n  \"\\n> [!CAUTION]\\n> A test defined in `passthru.tests` did not pass.\\n\"\ncheckTestsBuildReport True =\n  \"- The tests defined in `passthru.tests`, if any, passed\"\n\nversionWithoutPath :: String -> Text -> String\nversionWithoutPath resultPath expectedVersion =\n  -- We want to match expectedVersion, except when it is preceded by\n  -- the new store path (as wrappers contain the full store path which\n  -- often includes the version)\n  -- This can be done with negative lookbehind e.g.\n  -- /^(?<!${storePathWithoutVersion})${version}/\n  -- Note we also escape the version with \\Q/\\E for grep -P\n  let storePath = fromMaybe (T.pack resultPath) $ T.stripPrefix \"/nix/store/\" (T.pack resultPath)\n   in case T.breakOn expectedVersion storePath of\n        (_, \"\") ->\n          -- no version in prefix, just match version\n          \"\\\\Q\"\n            <> T.unpack expectedVersion\n            <> \"\\\\E\"\n        (storePrefix, _) ->\n          \"(?<!\\\\Q\"\n            <> T.unpack storePrefix\n            <> \"\\\\E)\\\\Q\"\n            <> T.unpack expectedVersion\n            <> \"\\\\E\"\n\nfoundVersionInOutputs :: Text -> String -> IO (Maybe Text)\nfoundVersionInOutputs expectedVersion resultPath =\n  hush\n    <$> runExceptT\n      ( do\n      
    let regex = versionWithoutPath resultPath expectedVersion\n          (exitCode, _) <-\n            proc \"grep\" [\"-rP\", regex, resultPath]\n              & ourReadProcessInterleaved\n          case exitCode of\n            ExitSuccess ->\n              return $\n                \"- found \"\n                  <> expectedVersion\n                  <> \" with grep in \"\n                  <> T.pack resultPath\n                  <> \"\\n\"\n            _ -> throwE \"grep did not find version in file names\"\n      )\n\nfoundVersionInFileNames :: Text -> String -> IO (Maybe Text)\nfoundVersionInFileNames expectedVersion resultPath =\n  hush\n    <$> runExceptT\n      ( do\n          (_, contents) <-\n            shell (\"find \" <> resultPath) & ourReadProcessInterleaved\n          (contents =~ versionRegex expectedVersion)\n            & hoistMaybe\n            & noteT (T.pack \"Expected version not found\")\n          return $\n            \"- found \"\n              <> expectedVersion\n              <> \" in filename of file in \"\n              <> T.pack resultPath\n              <> \"\\n\"\n      )\n\ntreeGist :: String -> IO (Maybe Text)\ntreeGist resultPath =\n  hush\n    <$> runExceptT\n      ( do\n          contents <- procTree [resultPath] & ourReadProcessInterleavedBS_\n          g <-\n            shell gistBin\n              & setStdin (byteStringInput contents)\n              & ourReadProcessInterleaved_\n          return $ \"- directory tree listing: \" <> g <> \"\\n\"\n      )\n\nduGist :: String -> IO (Maybe Text)\nduGist resultPath =\n  hush\n    <$> runExceptT\n      ( do\n          contents <- proc \"du\" [resultPath] & ourReadProcessInterleavedBS_\n          g <-\n            shell gistBin\n              & setStdin (byteStringInput contents)\n              & ourReadProcessInterleaved_\n          return $ \"- du listing: \" <> g <> \"\\n\"\n      )\n\nresult :: MonadIO m => UpdateEnv -> String -> m Text\nresult updateEnv resultPath =\n  liftIO 
$ do\n    let expectedVersion = newVersion updateEnv\n    testsBuild <- checkTestsBuild (packageName updateEnv)\n    someReports <-\n      fromMaybe \"\"\n        <$> foundVersionInOutputs expectedVersion resultPath\n          <> foundVersionInFileNames expectedVersion resultPath\n          <> treeGist resultPath\n          <> duGist resultPath\n    return $\n      let testsBuildSummary = checkTestsBuildReport testsBuild\n       in [interpolate|\n              $testsBuildSummary\n              $someReports\n            |]\n"
  },
  {
    "path": "src/Data/Hex.hs",
    "content": "{-# LANGUAGE FlexibleInstances #-}\n{-# LANGUAGE TypeSynonymInstances #-}\n\n-----------------------------------------------------------------------------\n\n-----------------------------------------------------------------------------\n\n-- |\n-- Module      :  Data.Hex\n-- Copyright   :  (c) Taru Karttunen 2009\n-- License     :  BSD-style\n-- Maintainer  :  taruti@taruti.net\n-- Stability   :  provisional\n-- Portability :  portable\n--\n-- Convert strings into hexadecimal and back.\nmodule Data.Hex (Hex (..)) where\n\nimport Control.Monad\nimport qualified Data.ByteString.Char8 as B\nimport qualified Data.ByteString.Lazy.Char8 as L\n\n-- | Convert strings into hexadecimal and back.\nclass Hex t where\n  -- | Convert string into hexadecimal.\n  hex :: t -> t\n\n  -- | Convert from hexadecimal and fail on invalid input.\n  unhex :: MonadFail m => t -> m t\n\ninstance Hex String where\n  hex = Prelude.concatMap w\n    where\n      w ch =\n        let s = \"0123456789ABCDEF\"\n            x = fromEnum ch\n         in [s !! div x 16, s !! mod x 16]\n  unhex [] = return []\n  unhex (a : b : r) = do\n    x <- c a\n    y <- c b\n    liftM (toEnum ((x * 16) + y) :) $ unhex r\n  unhex [_] = fail \"Non-even length\"\n\nc :: MonadFail m => Char -> m Int\nc '0' = return 0\nc '1' = return 1\nc '2' = return 2\nc '3' = return 3\nc '4' = return 4\nc '5' = return 5\nc '6' = return 6\nc '7' = return 7\nc '8' = return 8\nc '9' = return 9\nc 'A' = return 10\nc 'B' = return 11\nc 'C' = return 12\nc 'D' = return 13\nc 'E' = return 14\nc 'F' = return 15\nc 'a' = return 10\nc 'b' = return 11\nc 'c' = return 12\nc 'd' = return 13\nc 'e' = return 14\nc 'f' = return 15\nc _ = fail \"Invalid hex digit!\"\n\ninstance Hex B.ByteString where\n  hex = B.pack . hex . B.unpack\n  unhex x = liftM B.pack $ unhex $ B.unpack x\n\ninstance Hex L.ByteString where\n  hex = L.pack . hex . L.unpack\n  unhex x = liftM L.pack $ unhex $ L.unpack x\n"
  },
  {
    "path": "src/DeleteMerged.hs",
    "content": "{-# LANGUAGE OverloadedStrings #-}\n\nmodule DeleteMerged\n  ( deleteDone,\n  )\nwhere\n\nimport qualified Data.Text.IO as T\nimport qualified GH\nimport qualified Git\nimport GitHub.Data (Name, Owner)\nimport OurPrelude\n\ndeleteDone :: Bool -> Text -> Name Owner -> IO ()\ndeleteDone delete githubToken ghUser = do\n  result <-\n    runExceptT $ do\n      Git.fetch\n      Git.cleanAndResetTo \"master\"\n      refs <- ExceptT $ GH.closedAutoUpdateRefs (GH.authFromToken githubToken) ghUser\n      let branches = fmap (\\r -> (\"auto-update/\" <> r)) refs\n      if delete\n        then liftIO $ Git.deleteBranchesEverywhere branches\n        else liftIO $ do\n          T.putStrLn $ \"Would delete these branches for \" <> tshow ghUser <> \":\"\n          mapM_ (T.putStrLn . tshow) branches\n  case result of\n    Left e -> T.putStrLn e\n    _ -> return ()\n"
  },
  {
    "path": "src/File.hs",
    "content": "{-# LANGUAGE BlockArguments #-}\n{-# LANGUAGE LambdaCase #-}\n{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE TemplateHaskell #-}\n\nmodule File where\n\nimport qualified Data.Text as T\nimport Data.Text.IO as T\nimport OurPrelude\nimport Polysemy.Input\nimport Polysemy.Output\n\ndata File m a where\n  Read :: FilePath -> File m Text\n  Write :: FilePath -> Text -> File m ()\n\nmakeSem ''File\n\nrunIO ::\n  Member (Embed IO) r =>\n  Sem (File ': r) a ->\n  Sem r a\nrunIO =\n  interpret $ \\case\n    Read file -> embed $ T.readFile file\n    Write file contents -> embed $ T.writeFile file contents\n\nrunPure ::\n  [Text] ->\n  Sem (File ': r) a ->\n  Sem r ([Text], a)\nrunPure contentList =\n  runOutputMonoid pure\n    . runInputList contentList\n    . reinterpret2 \\case\n      Read _file -> maybe \"\" id <$> input\n      Write _file contents -> output contents\n\nreplace ::\n  Member File r =>\n  Text ->\n  Text ->\n  FilePath ->\n  Sem r Bool\nreplace find replacement file = do\n  contents <- File.read file\n  let newContents = T.replace find replacement contents\n  when (contents /= newContents) $ do\n    File.write file newContents\n  return $ contents /= newContents\n\nreplaceIO :: MonadIO m => Text -> Text -> FilePath -> m Bool\nreplaceIO find replacement file =\n  liftIO $\n    runFinal $\n      embedToFinal $\n        runIO $\n          (replace find replacement file)\n"
  },
  {
    "path": "src/GH.hs",
    "content": "{-# LANGUAGE ExtendedDefaultRules #-}\n{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE QuasiQuotes #-}\n{-# LANGUAGE ScopedTypeVariables #-}\n{-# LANGUAGE TupleSections #-}\n{-# OPTIONS_GHC -fno-warn-type-defaults #-}\n\nmodule GH\n  ( releaseUrl,\n    GH.untagName,\n    authFromToken,\n    checkExistingUpdatePR,\n    closedAutoUpdateRefs,\n    compareUrl,\n    latestVersion,\n    pr,\n    prUpdate,\n  )\nwhere\n\nimport Control.Applicative (liftA2, some)\nimport Data.Aeson (FromJSON)\nimport Data.Bitraversable (bitraverse)\nimport qualified Data.Text as T\nimport qualified Data.Text.Encoding as T\nimport Data.Time.Clock (addUTCTime, getCurrentTime)\nimport qualified Data.Vector as V\nimport qualified Git\nimport qualified GitHub as GH\nimport GitHub.Data.Name (Name (..))\nimport Network.HTTP.Client (HttpException (..), HttpExceptionContent (..), responseStatus)\nimport Network.HTTP.Types.Status (statusCode)\nimport OurPrelude\nimport Text.Regex.Applicative.Text ((=~))\nimport qualified Text.Regex.Applicative.Text as RE\nimport Utils (UpdateEnv (..), Version)\nimport qualified Utils as U\n\ndefault (T.Text)\n\ngReleaseUrl :: MonadIO m => GH.Auth -> URLParts -> ExceptT Text m Text\ngReleaseUrl auth (URLParts o r t) =\n  ExceptT $\n    bimap (T.pack . show) (GH.getUrl . 
GH.releaseHtmlUrl)\n      <$> liftIO (GH.github auth (GH.releaseByTagNameR o r t))\n\nreleaseUrl :: MonadIO m => UpdateEnv -> Text -> ExceptT Text m Text\nreleaseUrl env url = do\n  urlParts <- parseURL url\n  gReleaseUrl (authFrom env) urlParts\n\npr :: MonadIO m => UpdateEnv -> Text -> Text -> Text -> Text -> ExceptT Text m (Bool, Text)\npr env title body prHead base = do\n  tryPR `catchE` \\case\n    -- If creating the PR returns a 422, most likely cause is that the\n    -- branch was deleted, so push it again and retry once.\n    GH.HTTPError (HttpExceptionRequest _ (StatusCodeException r _))\n      | statusCode (responseStatus r) == 422 ->\n          Git.push env >> withExceptT (T.pack . show) tryPR\n    e ->\n      throwE . T.pack . show $ e\n  where\n    tryPR =\n      ExceptT $\n        fmap ((False,) . GH.getUrl . GH.pullRequestUrl)\n          <$> ( liftIO $\n                  ( GH.github\n                      (authFrom env)\n                      ( GH.createPullRequestR\n                          (N \"nixos\")\n                          (N \"nixpkgs\")\n                          (GH.CreatePullRequest title body prHead base)\n                      )\n                  )\n              )\n\nprUpdate :: forall m. MonadIO m => UpdateEnv -> Text -> Text -> Text -> Text -> ExceptT Text m (Bool, Text)\nprUpdate env title body prHead base = do\n  let runRequest :: FromJSON a => GH.Request k a -> ExceptT Text m a\n      runRequest = ExceptT . fmap (first (T.pack . show)) . liftIO . 
GH.github (authFrom env)\n  let inNixpkgs f = f (N \"nixos\") (N \"nixpkgs\")\n\n  prs <-\n    runRequest $\n      inNixpkgs GH.pullRequestsForR (GH.optionsHead prHead) GH.FetchAll\n\n  case V.toList prs of\n    [] -> pr env title body prHead base\n    (_ : _ : _) -> throwE $ \"Too many open PRs from \" <> prHead\n    [thePR] -> do\n      let withExistingPR :: (GH.Name GH.Owner -> GH.Name GH.Repo -> GH.IssueNumber -> a) -> a\n          withExistingPR f = inNixpkgs f (GH.simplePullRequestNumber thePR)\n\n      _ <-\n        runRequest $\n          withExistingPR GH.updatePullRequestR $\n            GH.EditPullRequest (Just title) Nothing Nothing Nothing Nothing\n\n      _ <-\n        runRequest $\n          withExistingPR GH.createCommentR body\n\n      return (True, GH.getUrl $ GH.simplePullRequestUrl thePR)\n\ndata URLParts = URLParts\n  { owner :: GH.Name GH.Owner,\n    repo :: GH.Name GH.Repo,\n    tag :: Text\n  }\n  deriving (Show)\n\n-- | Parse owner-repo-branch triplet out of URL\n-- We accept URLs pointing to uploaded release assets\n-- that are usually obtained with fetchurl, as well\n-- as the generated archives that fetchFromGitHub downloads.\n--\n-- Examples:\n--\n-- >>> parseURLMaybe \"https://github.com/blueman-project/blueman/releases/download/2.0.7/blueman-2.0.7.tar.xz\"\n-- Just (URLParts {owner = N \"blueman-project\", repo = N \"blueman\", tag = \"2.0.7\"})\n--\n-- >>> parseURLMaybe \"https://github.com/arvidn/libtorrent/archive/libtorrent_1_1_11.tar.gz\"\n-- Just (URLParts {owner = N \"arvidn\", repo = N \"libtorrent\", tag = \"libtorrent_1_1_11\"})\n--\n-- >>> parseURLMaybe \"https://gitlab.com/inkscape/lib2geom/-/archive/1.0/lib2geom-1.0.tar.gz\"\n-- Nothing\nparseURLMaybe :: Text -> Maybe URLParts\nparseURLMaybe url =\n  let domain = RE.string \"https://github.com/\"\n      slash = RE.sym '/'\n      pathSegment = T.pack <$> some (RE.psym (/= '/'))\n      extension = RE.string \".zip\" <|> RE.string \".tar.gz\"\n      toParts n o = URLParts (N 
n) (N o)\n      regex =\n        ( toParts\n            <$> (domain *> pathSegment)\n            <* slash\n            <*> pathSegment\n            <*> (RE.string \"/releases/download/\" *> pathSegment)\n            <* slash\n            <* pathSegment\n        )\n          <|> ( toParts\n                  <$> (domain *> pathSegment)\n                  <* slash\n                  <*> pathSegment\n                  <*> (RE.string \"/archive/\" *> pathSegment)\n                  <* extension\n              )\n   in url =~ regex\n\nparseURL :: MonadIO m => Text -> ExceptT Text m URLParts\nparseURL url =\n  tryJust (\"GitHub: \" <> url <> \" is not a GitHub URL.\") (parseURLMaybe url)\n\ncompareUrl :: MonadIO m => Text -> Text -> ExceptT Text m Text\ncompareUrl urlOld urlNew = do\n  oldParts <- parseURL urlOld\n  newParts <- parseURL urlNew\n  return $\n    \"https://github.com/\"\n      <> GH.untagName (owner newParts)\n      <> \"/\"\n      <> GH.untagName (repo newParts)\n      <> \"/compare/\"\n      <> tag oldParts\n      <> \"...\"\n      <> tag newParts\n\nautoUpdateRefs :: GH.Auth -> GH.Name GH.Owner -> IO (Either Text (Vector (Text, GH.Name GH.GitCommit)))\nautoUpdateRefs auth ghUser =\n  GH.github auth (GH.referencesR ghUser \"nixpkgs\" GH.FetchAll)\n    & ((fmap . fmapL) tshow)\n    & ((fmap . fmapR) (fmap (liftA2 (,) (GH.gitReferenceRef >>> GH.untagName) (GH.gitReferenceObject >>> GH.gitObjectSha >>> N)) >>> V.mapMaybe (bitraverse (T.stripPrefix prefix) pure)))\n  where\n    prefix = \"refs/heads/auto-update/\"\n\nopenPRWithAutoUpdateRefFrom :: GH.Auth -> GH.Name GH.Owner -> Text -> IO (Either Text Bool)\nopenPRWithAutoUpdateRefFrom auth ghUser ref =\n  GH.executeRequest\n    auth\n    ( GH.pullRequestsForR\n        \"nixos\"\n        \"nixpkgs\"\n        (GH.optionsHead (GH.untagName ghUser <> \":\" <> U.branchPrefix <> ref) <> GH.stateOpen)\n        GH.FetchAll\n    )\n    <&> bimap (T.pack . show) (not . 
V.null)\n\ncommitIsOldEnoughToDelete :: GH.Auth -> GH.Name GH.Owner -> GH.Name GH.GitCommit -> IO Bool\ncommitIsOldEnoughToDelete auth ghUser sha = do\n  now <- getCurrentTime\n  let cutoff = addUTCTime (-30 * 60) now\n  GH.executeRequest auth (GH.gitCommitR ghUser \"nixpkgs\" sha)\n    <&> either (const False) ((< cutoff) . GH.gitUserDate . GH.gitCommitCommitter)\n\nrefShouldBeDeleted :: GH.Auth -> GH.Name GH.Owner -> (Text, GH.Name GH.GitCommit) -> IO Bool\nrefShouldBeDeleted auth ghUser (ref, sha) =\n  liftA2\n    (&&)\n    (either (const False) not <$> openPRWithAutoUpdateRefFrom auth ghUser ref)\n    (commitIsOldEnoughToDelete auth ghUser sha)\n\nclosedAutoUpdateRefs :: GH.Auth -> GH.Name GH.Owner -> IO (Either Text (Vector Text))\nclosedAutoUpdateRefs auth ghUser =\n  runExceptT $ do\n    aur :: Vector (Text, GH.Name GH.GitCommit) <- ExceptT $ GH.autoUpdateRefs auth ghUser\n    ExceptT (Right . V.map fst <$> V.filterM (refShouldBeDeleted auth ghUser) aur)\n\nauthFromToken :: Text -> GH.Auth\nauthFromToken = GH.OAuth . T.encodeUtf8\n\nauthFrom :: UpdateEnv -> GH.Auth\nauthFrom = authFromToken . U.githubToken . options\n\ncheckExistingUpdatePR :: MonadIO m => UpdateEnv -> Text -> ExceptT Text m ()\ncheckExistingUpdatePR env attrPath = do\n  searchResult <-\n    ExceptT $\n      liftIO $\n        (GH.github (authFrom env) (GH.searchIssuesR search) GH.FetchAll)\n          & fmap (first (T.pack . 
show))\n  if T.length (openPRReport searchResult) == 0\n    then return ()\n    else\n      throwE\n        ( \"There might already be an open PR for this update:\\n\"\n            <> openPRReport searchResult\n        )\n  where\n    title = U.prTitle env attrPath\n    search = [interpolate|repo:nixos/nixpkgs $title |]\n    openPRReport searchResult =\n      GH.searchResultResults searchResult\n        & V.filter (GH.issueClosedAt >>> isNothing)\n        & V.filter (GH.issuePullRequest >>> isJust)\n        & fmap report\n        & V.toList\n        & T.unlines\n    report i = \"- \" <> GH.issueTitle i <> \"\\n  \" <> tshow (GH.issueUrl i)\n\nlatestVersion :: MonadIO m => UpdateEnv -> Text -> ExceptT Text m Version\nlatestVersion env url = do\n  urlParts <- parseURL url\n  r <-\n    fmapLT tshow $\n      ExceptT $\n        liftIO $\n          GH.executeRequest (authFrom env) $\n            GH.latestReleaseR (owner urlParts) (repo urlParts)\n  return $ T.dropWhile (\\c -> c == 'v' || c == 'V') (GH.releaseTagName r)\n"
  },
  {
    "path": "src/Git.hs",
    "content": "{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE TemplateHaskell #-}\n\nmodule Git\n  ( findAutoUpdateBranchMessage,\n    mergeBase,\n    cleanAndResetTo,\n    commit,\n    deleteBranchesEverywhere,\n    delete1,\n    diff,\n    diffFileNames,\n    fetch,\n    fetchIfStale,\n    headRev,\n    push,\n    nixpkgsDir,\n    setupNixpkgs,\n    Git.show,\n    worktreeAdd,\n    worktreeRemove,\n  )\nwhere\n\nimport Control.Concurrent\nimport Control.Exception\nimport qualified Data.ByteString as BS\nimport qualified Data.ByteString.Lazy as BSL\nimport Data.Maybe (fromJust)\nimport qualified Data.Text as T\nimport qualified Data.Text.IO as T\nimport Data.Time.Clock (addUTCTime, getCurrentTime)\nimport qualified Data.Vector as V\nimport Language.Haskell.TH.Env (envQ)\nimport OurPrelude hiding (throw)\nimport System.Directory (doesDirectoryExist, doesFileExist, getCurrentDirectory, getModificationTime, setCurrentDirectory)\nimport System.Environment.XDG.BaseDir (getUserCacheDir)\nimport System.Exit ()\nimport System.IO.Error (tryIOError)\nimport System.Posix.Env (setEnv)\nimport Utils (Options (..), UpdateEnv (..), branchName, branchPrefix)\n\nbin :: String\nbin = fromJust ($$(envQ \"GIT\") :: Maybe String) <> \"/bin/git\"\n\nprocGit :: [String] -> ProcessConfig () () ()\nprocGit = proc bin\n\nclean :: ProcessConfig () () ()\nclean = silently $ procGit [\"clean\", \"-fdx\"]\n\nworktreeAdd :: FilePath -> Text -> UpdateEnv -> IO ()\nworktreeAdd path commitish updateEnv =\n  runProcessNoIndexIssue_IO $ silently $ procGit [\"worktree\", \"add\", \"-b\", T.unpack (branchName updateEnv), path, T.unpack commitish]\n\nworktreeRemove :: FilePath -> IO ()\nworktreeRemove path = do\n  exist <- doesDirectoryExist path\n  if exist\n    then runProcessNoIndexIssue_IO $ silently $ procGit [\"worktree\", \"remove\", \"--force\", path]\n    else return ()\n\ncheckout :: Text -> Text -> ProcessConfig () () ()\ncheckout branch target =\n  silently $ procGit [\"checkout\", 
\"-B\", T.unpack branch, T.unpack target]\n\nreset :: Text -> ProcessConfig () () ()\nreset target = silently $ procGit [\"reset\", \"--hard\", T.unpack target]\n\ndelete1 :: Text -> IO ()\ndelete1 bName = ignoreExitCodeException $ runProcessNoIndexIssue_IO (delete1' bName)\n\ndelete1' :: Text -> ProcessConfig () () ()\ndelete1' branch = delete [branch]\n\ndelete :: [Text] -> ProcessConfig () () ()\ndelete branches = silently $ procGit ([\"branch\", \"-D\"] ++ fmap T.unpack branches)\n\ndeleteOrigin :: [Text] -> ProcessConfig () () ()\ndeleteOrigin branches =\n  silently $ procGit ([\"push\", \"origin\", \"--delete\"] ++ fmap T.unpack branches)\n\ncleanAndResetTo :: MonadIO m => Text -> ExceptT Text m ()\ncleanAndResetTo branch =\n  let target = \"upstream/\" <> branch\n   in do\n        runProcessNoIndexIssue_ $ silently $ procGit [\"reset\", \"--hard\"]\n        runProcessNoIndexIssue_ clean\n        runProcessNoIndexIssue_ $ checkout branch target\n        runProcessNoIndexIssue_ $ reset target\n        runProcessNoIndexIssue_ clean\n\nshow :: MonadIO m => Text -> Text -> ExceptT Text m Text\nshow branch file =\n  readProcessInterleavedNoIndexIssue_ $ silently $ procGit [\"show\", T.unpack (\"remotes/upstream/\" <> branch <> \":\" <> file)]\n\ndiff :: MonadIO m => Text -> ExceptT Text m Text\ndiff branch = readProcessInterleavedNoIndexIssue_ $ procGit [\"diff\", T.unpack branch]\n\ndiffFileNames :: MonadIO m => Text -> ExceptT Text m [Text]\ndiffFileNames branch =\n  readProcessInterleavedNoIndexIssue_ (procGit [\"diff\", T.unpack branch, \"--name-only\"])\n    & fmapRT T.lines\n\nstaleFetchHead :: MonadIO m => m Bool\nstaleFetchHead =\n  liftIO $ do\n    nixpkgsGit <- getUserCacheDir \"nixpkgs\"\n    let fetchHead = nixpkgsGit <> \"/.git/FETCH_HEAD\"\n    oneHourAgo <- addUTCTime (fromInteger $ -60 * 60) <$> getCurrentTime\n    e <- tryIOError $ getModificationTime fetchHead\n    if isLeft e\n      then do\n        return True\n      else do\n        
fetchedLast <- getModificationTime fetchHead\n        return (fetchedLast < oneHourAgo)\n\nfetchIfStale :: MonadIO m => ExceptT Text m ()\nfetchIfStale = whenM staleFetchHead fetch\n\nfetch :: MonadIO m => ExceptT Text m ()\nfetch =\n  runProcessNoIndexIssue_ $\n    silently $\n      procGit [\"fetch\", \"-q\", \"--prune\", \"--multiple\", \"upstream\", \"origin\"]\n\npush :: MonadIO m => UpdateEnv -> ExceptT Text m ()\npush updateEnv =\n  runProcessNoIndexIssue_\n    ( procGit\n        ( [ \"push\",\n            \"--force\",\n            \"--set-upstream\",\n            \"origin\",\n            T.unpack (branchName updateEnv)\n          ]\n            ++ [\"--dry-run\" | not (doPR (options updateEnv))]\n        )\n    )\n\nnixpkgsDir :: IO FilePath\nnixpkgsDir = do\n  inNixpkgs <- inNixpkgsRepo\n  if inNixpkgs\n    then getCurrentDirectory\n    else getUserCacheDir \"nixpkgs\"\n\n-- Set up a Nixpkgs clone in $XDG_CACHE_DIR/nixpkgs\n-- Since we are going to have to fetch, git reset, clean, and commit, we set up a\n-- cache dir to avoid destroying any uncommitted work the user may have in PWD.\nsetupNixpkgs :: Text -> IO ()\nsetupNixpkgs ghUser = do\n  fp <- nixpkgsDir\n  exists <- doesDirectoryExist fp\n  unless exists $ do\n    procGit [\"clone\", \"--origin\", \"upstream\", \"https://github.com/NixOS/nixpkgs.git\", fp]\n      & runProcess_\n    setCurrentDirectory fp\n    procGit [\"remote\", \"add\", \"origin\", \"https://github.com/\" <> T.unpack ghUser <> \"/nixpkgs.git\"]\n      -- requires that user has forked nixpkgs\n      & runProcess_\n  inNixpkgs <- inNixpkgsRepo\n  unless inNixpkgs do\n    setCurrentDirectory fp\n    _ <- runExceptT fetchIfStale\n    _ <- runExceptT $ cleanAndResetTo \"master\"\n    return ()\n  System.Posix.Env.setEnv \"NIX_PATH\" (\"nixpkgs=\" <> fp) True\n\nmergeBase :: IO Text\nmergeBase = do\n  readProcessInterleavedNoIndexIssue_IO\n    (procGit [\"merge-base\", \"upstream/master\", \"upstream/staging\"])\n    & fmap T.strip\n\n-- 
Return Nothing if a remote branch for this package doesn't exist. If a\n-- branch does exist, return a Just of its last commit message.\nfindAutoUpdateBranchMessage :: MonadIO m => Text -> ExceptT Text m (Maybe Text)\nfindAutoUpdateBranchMessage pName = do\n  remoteBranches <-\n    readProcessInterleavedNoIndexIssue_ (procGit [\"branch\", \"--remote\", \"--format=%(refname:short) %(subject)\"])\n      & fmapRT (T.lines >>> fmap (T.strip >>> T.breakOn \" \"))\n  return $\n    lookup (\"origin/\" <> branchPrefix <> pName) remoteBranches\n      & fmap (T.drop 1)\n\ninNixpkgsRepo :: IO Bool\ninNixpkgsRepo = do\n  currentDir <- getCurrentDirectory\n  doesFileExist (currentDir <> \"/nixos/release.nix\")\n\ncommit :: MonadIO m => Text -> ExceptT Text m ()\ncommit ref =\n  runProcessNoIndexIssue_ (procGit [\"commit\", \"-am\", T.unpack ref])\n\nheadRev :: MonadIO m => ExceptT Text m Text\nheadRev = T.strip <$> readProcessInterleavedNoIndexIssue_ (procGit [\"rev-parse\", \"HEAD\"])\n\ndeleteBranchesEverywhere :: Vector Text -> IO ()\ndeleteBranchesEverywhere branches = do\n  let branchList = V.toList $ branches\n  if null branchList\n    then return ()\n    else do\n      result <- runExceptT $ runProcessNoIndexIssue_ (delete branchList)\n      case result of\n        Left error1 -> T.putStrLn $ tshow error1\n        Right success1 -> T.putStrLn $ tshow success1\n      result2 <- runExceptT $ runProcessNoIndexIssue_ (deleteOrigin branchList)\n      case result2 of\n        Left error2 -> T.putStrLn $ tshow error2\n        Right success2 -> T.putStrLn $ tshow success2\n\nrunProcessNoIndexIssue_IO ::\n  ProcessConfig () () () -> IO ()\nrunProcessNoIndexIssue_IO config = go\n  where\n    go = do\n      (code, out, e) <- readProcess config\n      case code of\n        ExitFailure 128\n          | \"index.lock\" `BS.isInfixOf` BSL.toStrict e -> do\n              threadDelay 100000\n              go\n        ExitSuccess -> return ()\n        ExitFailure _ -> throw $ 
ExitCodeException code config out e\n\nrunProcessNoIndexIssue_ ::\n  MonadIO m => ProcessConfig () () () -> ExceptT Text m ()\nrunProcessNoIndexIssue_ config = tryIOTextET go\n  where\n    go = do\n      (code, out, e) <- readProcess config\n      case code of\n        ExitFailure 128\n          | \"index.lock\" `BS.isInfixOf` BSL.toStrict e -> do\n              threadDelay 100000\n              go\n        ExitSuccess -> return ()\n        ExitFailure _ -> throw $ ExitCodeException code config out e\n\nreadProcessInterleavedNoIndexIssue_ ::\n  MonadIO m => ProcessConfig () () () -> ExceptT Text m Text\nreadProcessInterleavedNoIndexIssue_ config = tryIOTextET go\n  where\n    go = do\n      (code, out) <- readProcessInterleaved config\n      case code of\n        ExitFailure 128\n          | \"index.lock\" `BS.isInfixOf` BSL.toStrict out -> do\n              threadDelay 100000\n              go\n        ExitSuccess -> return $ bytestringToText out\n        ExitFailure _ -> throw $ ExitCodeException code config out out\n\nreadProcessInterleavedNoIndexIssue_IO ::\n  ProcessConfig () () () -> IO Text\nreadProcessInterleavedNoIndexIssue_IO config = go\n  where\n    go = do\n      (code, out) <- readProcessInterleaved config\n      case code of\n        ExitFailure 128\n          | \"index.lock\" `BS.isInfixOf` BSL.toStrict out -> do\n              threadDelay 100000\n              go\n        ExitSuccess -> return $ bytestringToText out\n        ExitFailure _ -> throw $ ExitCodeException code config out out\n"
  },
  {
    "path": "src/NVD.hs",
    "content": "{-# LANGUAGE NamedFieldPuns #-}\n{-# LANGUAGE OverloadedStrings #-}\n\nmodule NVD\n  ( withVulnDB,\n    getCVEs,\n    Connection,\n    ProductID,\n    Version,\n    CVE,\n    CVEID,\n    UTCTime,\n  )\nwhere\n\nimport CVE\n  ( CPEMatch (..),\n    CPEMatchRow (..),\n    CVE (..),\n    CVEID,\n    cpeMatches,\n    parseFeed,\n  )\nimport Codec.Compression.GZip (decompress)\nimport Control.Exception (SomeException, try)\nimport Crypto.Hash.SHA256 (hashlazy)\nimport qualified Data.ByteString.Lazy.Char8 as BSL\nimport Data.Hex (hex, unhex)\nimport Data.List (group)\nimport qualified Data.Text as T\nimport Data.Time.Calendar (toGregorian)\nimport Data.Time.Clock\n  ( UTCTime,\n    diffUTCTime,\n    getCurrentTime,\n    nominalDay,\n    utctDay,\n  )\nimport Data.Time.ISO8601 (parseISO8601)\nimport Database.SQLite.Simple\n  ( Connection,\n    Only (..),\n    Query (..),\n    execute,\n    executeMany,\n    execute_,\n    query,\n    withConnection,\n    withTransaction,\n  )\nimport qualified NVDRules\nimport Network.HTTP.Conduit (simpleHttp)\nimport OurPrelude\nimport System.Directory\n  ( XdgDirectory (..),\n    createDirectoryIfMissing,\n    getXdgDirectory,\n    removeFile,\n  )\nimport Utils (ProductID, Version)\nimport Version (matchVersion)\n\n-- | Either @recent@, @modified@, or any year since @2002@.\ntype FeedID = String\n\ntype Extension = String\n\ntype Timestamp = UTCTime\n\ntype Checksum = BSL.ByteString\n\ntype DBVersion = Int\n\ndata Meta\n  = Meta Timestamp Checksum\n\n-- | Database version the software expects. If the software version is\n-- higher than the database version or the database has not been updated in more\n-- than 7.5 days, the database will be deleted and rebuilt from scratch. 
Bump\n-- this when the database layout changes or the build-time data filtering\n-- changes.\nsoftwareVersion :: DBVersion\nsoftwareVersion = 2\n\ngetDBPath :: IO FilePath\ngetDBPath = do\n  cacheDir <- getXdgDirectory XdgCache \"nixpkgs-update\"\n  createDirectoryIfMissing True cacheDir\n  pure $ cacheDir </> \"nvd.sqlite3\"\n\nwithDB :: (Connection -> IO a) -> IO a\nwithDB action = do\n  dbPath <- getDBPath\n  withConnection dbPath action\n\nmarkUpdated :: Connection -> IO ()\nmarkUpdated conn = do\n  now <- getCurrentTime\n  execute conn \"UPDATE meta SET last_update = ?\" [now]\n\n-- | Rebuild the entire database, redownloading all data.\nrebuildDB :: IO ()\nrebuildDB = do\n  dbPath <- getDBPath\n  removeFile dbPath\n  withConnection dbPath $ \\conn -> do\n    execute_ conn \"CREATE TABLE meta (db_version int, last_update text)\"\n    execute\n      conn\n      \"INSERT INTO meta VALUES (?, ?)\"\n      (softwareVersion, \"1970-01-01 00:00:00\" :: Text)\n    execute_ conn $\n      Query $\n        T.unlines\n          [ \"CREATE TABLE cves (\",\n            \"  cve_id text PRIMARY KEY,\",\n            \"  description text,\",\n            \"  published text,\",\n            \"  modified text)\"\n          ]\n    execute_ conn $\n      Query $\n        T.unlines\n          [ \"CREATE TABLE cpe_matches (\",\n            \"  cve_id text REFERENCES cve,\",\n            \"  part text,\",\n            \"  vendor text,\",\n            \"  product text,\",\n            \"  version text,\",\n            \"  \\\"update\\\" text,\",\n            \"  edition text,\",\n            \"  language text,\",\n            \"  software_edition text,\",\n            \"  target_software text,\",\n            \"  target_hardware text,\",\n            \"  other text,\",\n            \"  matcher text)\"\n          ]\n    execute_ conn \"CREATE INDEX matchers_by_cve ON cpe_matches(cve_id)\"\n    execute_ conn \"CREATE INDEX matchers_by_product ON cpe_matches(product)\"\n    execute_ conn 
\"CREATE INDEX matchers_by_vendor ON cpe_matches(vendor)\"\n    execute_\n      conn\n      \"CREATE INDEX matchers_by_target_software ON cpe_matches(target_software)\"\n    years <- allYears\n    forM_ years $ updateFeed conn\n    markUpdated conn\n\nfeedURL :: FeedID -> Extension -> String\nfeedURL feed ext =\n  \"https://nvd.nist.gov/feeds/json/cve/1.1/nvdcve-1.1-\" <> feed <> ext\n\nthrowString :: String -> IO a\nthrowString = ioError . userError\n\nthrowText :: Text -> IO a\nthrowText = throwString . T.unpack\n\nallYears :: IO [FeedID]\nallYears = do\n  now <- getCurrentTime\n  let (year, _, _) = toGregorian $ utctDay now\n  return $ map show [2002 .. year]\n\nparseMeta :: BSL.ByteString -> Either T.Text Meta\nparseMeta raw = do\n  let splitLine = second BSL.tail . BSL.break (== ':') . BSL.takeWhile (/= '\\r')\n  let fields = map splitLine $ BSL.lines raw\n  lastModifiedDate <-\n    note \"no lastModifiedDate in meta\" $ lookup \"lastModifiedDate\" fields\n  sha256 <- note \"no sha256 in meta\" $ lookup \"sha256\" fields\n  timestamp <-\n    note \"invalid lastModifiedDate in meta\" $\n      parseISO8601 $\n        BSL.unpack lastModifiedDate\n  checksum <- note \"invalid sha256 in meta\" $ unhex sha256\n  return $ Meta timestamp checksum\n\ngetMeta :: FeedID -> IO Meta\ngetMeta feed = do\n  raw <- simpleHttp $ feedURL feed \".meta\"\n  either throwText pure $ parseMeta raw\n\ngetCVE :: Connection -> CVEID -> IO CVE\ngetCVE conn cveID_ = do\n  cves <-\n    query\n      conn\n      ( Query $\n          T.unlines\n            [ \"SELECT cve_id, description, published, modified\",\n              \"FROM cves\",\n              \"WHERE cve_id = ?\"\n            ]\n      )\n      (Only cveID_)\n  case cves of\n    [cve] -> pure cve\n    [] -> fail $ \"no cve with id \" <> (T.unpack cveID_)\n    _ -> fail $ \"multiple cves with id \" <> (T.unpack cveID_)\n\ngetCVEs :: Connection -> ProductID -> Version -> IO [CVE]\ngetCVEs conn productID version = do\n  matches :: 
[CPEMatchRow] <-\n    query\n      conn\n      ( Query $\n          T.unlines\n            [ \"SELECT\",\n              \"  cve_id,\",\n              \"  part,\",\n              \"  vendor,\",\n              \"  product,\",\n              \"  version,\",\n              \"  \\\"update\\\",\",\n              \"  edition,\",\n              \"  language,\",\n              \"  software_edition,\",\n              \"  target_software,\",\n              \"  target_hardware,\",\n              \"  other,\",\n              \"  matcher\",\n              \"FROM cpe_matches\",\n              \"WHERE vendor = ? or product = ? or edition = ? or software_edition = ? or target_software = ?\",\n              \"ORDER BY cve_id\"\n            ]\n      )\n      (productID, productID, productID, productID, productID)\n  let cveIDs =\n        map head $\n          group $\n            flip mapMaybe matches $\n              \\(CPEMatchRow cve cpeMatch) ->\n                if matchVersion (cpeMatchVersionMatcher cpeMatch) version\n                  then\n                    if NVDRules.filter cve cpeMatch productID version\n                      then Just (cveID cve)\n                      else Nothing\n                  else Nothing\n  forM cveIDs $ getCVE conn\n\nputCVEs :: Connection -> [CVE] -> IO ()\nputCVEs conn cves = do\n  withTransaction conn $ do\n    executeMany\n      conn\n      \"DELETE FROM cves WHERE cve_id = ?\"\n      (map (Only . cveID) cves)\n    executeMany\n      conn\n      ( Query $\n          T.unlines\n            [ \"INSERT INTO cves(cve_id, description, published, modified)\",\n              \"VALUES (?, ?, ?, ?)\"\n            ]\n      )\n      cves\n    executeMany\n      conn\n      \"DELETE FROM cpe_matches WHERE cve_id = ?\"\n      (map (Only . 
cveID) cves)\n    executeMany\n      conn\n      ( Query $\n          T.unlines\n            [ \"INSERT INTO cpe_matches(\",\n              \"  cve_id,\",\n              \"  part,\",\n              \"  vendor,\",\n              \"  product,\",\n              \"  version,\",\n              \"  \\\"update\\\",\",\n              \"  edition,\",\n              \"  language,\",\n              \"  software_edition,\",\n              \"  target_software,\",\n              \"  target_hardware,\",\n              \"  other,\",\n              \"  matcher)\",\n              \"VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\"\n            ]\n      )\n      (cpeMatches cves)\n\ngetDBMeta :: Connection -> IO (DBVersion, UTCTime)\ngetDBMeta conn = do\n  rows <- query conn \"SELECT db_version, last_update FROM meta\" ()\n  case rows of\n    [meta] -> pure meta\n    _ -> fail \"failed to get meta information\"\n\nneedsRebuild :: IO Bool\nneedsRebuild = do\n  dbMeta <- try $ withDB getDBMeta\n  currentTime <- getCurrentTime\n  case dbMeta of\n    Left (e :: SomeException) -> do\n      putStrLn $ \"rebuilding database because \" <> show e\n      pure True\n    Right (dbVersion, t) ->\n      pure $\n        diffUTCTime currentTime t > (7.5 * nominalDay)\n          || dbVersion /= softwareVersion\n\n-- | Download a feed and store it in the database.\nupdateFeed :: Connection -> FeedID -> IO ()\nupdateFeed conn feedID = do\n  putStrLn $ \"Updating National Vulnerability Database feed (\" <> feedID <> \")\"\n  json <- downloadFeed feedID\n  parsedCVEs <- either throwText pure $ parseFeed json\n  putCVEs conn parsedCVEs\n\n-- | Update the vulnerability database and run an action with a connection to\n-- it.\nwithVulnDB :: (Connection -> IO a) -> IO a\nwithVulnDB action = do\n  rebuild <- needsRebuild\n  when rebuild rebuildDB\n  withDB $ \\conn -> do\n    (_, lastUpdate) <- withDB getDBMeta\n    currentTime <- getCurrentTime\n    when (diffUTCTime currentTime lastUpdate > (0.25 * 
nominalDay)) $ do\n      updateFeed conn \"modified\"\n      markUpdated conn\n    action conn\n\n-- | Download a feed, verify its checksum against the published .meta file,\n-- and return the decompressed contents as a ByteString.\ndownloadFeed :: FeedID -> IO BSL.ByteString\ndownloadFeed feed = do\n  Meta _ expectedChecksum <- getMeta feed\n  compressed <- simpleHttp $ feedURL feed \".json.gz\"\n  let raw = decompress compressed\n  let actualChecksum = BSL.fromStrict $ hashlazy raw\n  when (actualChecksum /= expectedChecksum) $\n    throwString $\n      \"wrong hash, expected: \"\n        <> BSL.unpack (hex expectedChecksum)\n        <> \" got: \"\n        <> BSL.unpack (hex actualChecksum)\n  return raw\n"
  },
  {
    "path": "src/NVDRules.hs",
    "content": "{-# LANGUAGE OverloadedStrings #-}\n\nmodule NVDRules where\n\nimport CVE (CPE (..), CPEMatch (..), CVE (..))\nimport Data.Char (isDigit)\nimport qualified Data.Text as T\nimport OurPrelude\nimport Text.Regex.Applicative.Text (RE', anySym, many, psym, (=~))\nimport Utils (Boundary (..), ProductID, Version, VersionMatcher (..))\n\n-- Return False to discard CVE\nfilter :: CVE -> CPEMatch -> ProductID -> Version -> Bool\nfilter _ cpeMatch \"socat\" v\n  | cpeUpdatePresentAndNotPartOfVersion cpeMatch v = False -- TODO consider if this rule should be applied to all packages\nfilter _ cpeMatch \"uzbl\" v\n  | isNothing (v =~ yearRegex)\n      && \"2009.12.22\"\n      `anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =\n      False\n  | isNothing (v =~ yearRegex)\n      && \"2010.04.03\"\n      `anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =\n      False\nfilter _ cpeMatch \"go\" v\n  | \".\"\n      `T.isInfixOf` v\n      && \"-\"\n      `anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =\n      False\nfilter _ cpeMatch \"terraform\" _\n  | cpeTargetSoftware (cpeMatchCPE cpeMatch) == Just \"aws\" = False\nfilter cve _ \"tor\" _\n  | cveID cve == \"CVE-2017-16541\" = False\nfilter _ cpeMatch \"arena\" _\n  | cpeVendor (cpeMatchCPE cpeMatch) == Just \"rockwellautomation\"\n      || cpeVendor (cpeMatchCPE cpeMatch) == Just \"openforis\" =\n      False\nfilter _ cpeMatch \"thrift\" _\n  | cpeVendor (cpeMatchCPE cpeMatch) == Just \"facebook\" = False\nfilter _ cpeMatch \"kanboard\" _\n  | cpeTargetSoftware (cpeMatchCPE cpeMatch) == Just \"jenkins\" = False\nfilter _cve _match _productID _version = True\n\nanyVersionInfixOf :: Text -> VersionMatcher -> Bool\nanyVersionInfixOf t (SingleMatcher v) = t `T.isInfixOf` v\nanyVersionInfixOf t (RangeMatcher (Including v1) (Including v2)) =\n  t `T.isInfixOf` v1 || t `T.isInfixOf` v2\nanyVersionInfixOf t (RangeMatcher (Excluding v1) (Excluding v2)) =\n  t `T.isInfixOf` v1 || t `T.isInfixOf` 
v2\nanyVersionInfixOf t (RangeMatcher (Including v1) (Excluding v2)) =\n  t `T.isInfixOf` v1 || t `T.isInfixOf` v2\nanyVersionInfixOf t (RangeMatcher (Excluding v1) (Including v2)) =\n  t `T.isInfixOf` v1 || t `T.isInfixOf` v2\nanyVersionInfixOf t (RangeMatcher Unbounded (Including v)) = t `T.isInfixOf` v\nanyVersionInfixOf t (RangeMatcher Unbounded (Excluding v)) = t `T.isInfixOf` v\nanyVersionInfixOf t (RangeMatcher (Including v) Unbounded) = t `T.isInfixOf` v\nanyVersionInfixOf t (RangeMatcher (Excluding v) Unbounded) = t `T.isInfixOf` v\nanyVersionInfixOf _ (RangeMatcher Unbounded Unbounded) = False\n\n-- Four digits at the start followed by any number of anything else\nyearRegex :: RE' ()\nyearRegex =\n  void $\n    psym isDigit <* psym isDigit <* psym isDigit <* psym isDigit <* many anySym\n\ncpeUpdatePresentAndNotPartOfVersion :: CPEMatch -> Version -> Bool\ncpeUpdatePresentAndNotPartOfVersion cpeMatch v =\n  maybe\n    False\n    (\\update -> not (update `T.isInfixOf` v))\n    (cpeUpdate (cpeMatchCPE cpeMatch))\n"
  },
  {
    "path": "src/Nix.hs",
    "content": "{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE ScopedTypeVariables #-}\n{-# LANGUAGE TemplateHaskell #-}\n\nmodule Nix\n  ( assertNewerVersion,\n    assertOldVersionOn,\n    binPath,\n    build,\n    getAttr,\n    getAttrString,\n    getChangelog,\n    getDerivationFile,\n    getDescription,\n    getHash,\n    getHashFromBuild,\n    getHomepage,\n    getMaintainers,\n    getPatches,\n    getSrcUrl,\n    hasPatchNamed,\n    hasUpdateScript,\n    lookupAttrPath,\n    numberOfFetchers,\n    numberOfHashes,\n    resultLink,\n    runUpdateScript,\n    fakeHashMatching,\n    version,\n    Raw (..),\n  )\nwhere\n\nimport Data.Maybe (fromJust)\nimport qualified Data.Text as T\nimport qualified Git\nimport Language.Haskell.TH.Env (envQ)\nimport OurPrelude\nimport System.Exit ()\nimport qualified System.Process.Typed as TP\nimport Utils (UpdateEnv (..), nixBuildOptions, nixCommonOptions, srcOrMain)\nimport Prelude hiding (log)\n\nbinPath :: String\nbinPath = fromJust ($$(envQ \"NIX\") :: Maybe String) <> \"/bin\"\n\ndata Env = Env [(String, String)]\n\ndata Raw\n  = Raw\n  | NoRaw\n\ndata EvalOptions = EvalOptions Raw Env\n\nrawOpt :: Raw -> [String]\nrawOpt Raw = [\"--raw\"]\nrawOpt NoRaw = []\n\nnixEvalApply ::\n  MonadIO m =>\n  Text ->\n  Text ->\n  ExceptT Text m Text\nnixEvalApply applyFunc attrPath =\n  ourReadProcess_\n    (proc (binPath <> \"/nix\") ([\"--extra-experimental-features\", \"nix-command\", \"--extra-experimental-features\", \"flakes\", \"eval\", \".#\" <> T.unpack attrPath, \"--apply\", T.unpack applyFunc]))\n    & fmapRT (fst >>> T.strip)\n\nnixEvalApplyRaw ::\n  MonadIO m =>\n  Text ->\n  Text ->\n  ExceptT Text m Text\nnixEvalApplyRaw applyFunc attrPath =\n  ourReadProcess_\n    (proc (binPath <> \"/nix\") ([\"--extra-experimental-features\", \"nix-command\", \"--extra-experimental-features\", \"flakes\", \"eval\", \".#\" <> T.unpack attrPath, \"--raw\", \"--apply\", T.unpack applyFunc]))\n    & fmapRT (fst >>> 
T.strip)\n\nnixEvalExpr ::\n  MonadIO m =>\n  Text ->\n  ExceptT Text m Text\nnixEvalExpr expr =\n  ourReadProcess_\n    (proc (binPath <> \"/nix\") ([\"--extra-experimental-features\", \"nix-command\", \"eval\", \"--expr\", T.unpack expr]))\n    & fmapRT (fst >>> T.strip)\n\n-- Error if the \"new version\" is not actually newer according to nix\nassertNewerVersion :: MonadIO m => UpdateEnv -> ExceptT Text m ()\nassertNewerVersion updateEnv = do\n  versionComparison <-\n    nixEvalExpr\n      ( \"(builtins.compareVersions \\\"\"\n          <> newVersion updateEnv\n          <> \"\\\" \\\"\"\n          <> oldVersion updateEnv\n          <> \"\\\")\"\n      )\n  case versionComparison of\n    \"1\" -> return ()\n    a ->\n      throwE\n        ( newVersion updateEnv\n            <> \" is not newer than \"\n            <> oldVersion updateEnv\n            <> \" according to Nix; versionComparison: \"\n            <> a\n            <> \" \"\n        )\n\n-- This is extremely slow but gives us the best results we know of\nlookupAttrPath :: MonadIO m => UpdateEnv -> ExceptT Text m Text\nlookupAttrPath updateEnv =\n  -- lookup attrpath by nix-env\n  ( proc\n      (binPath <> \"/nix-env\")\n      ( [ \"-qa\",\n          (packageName updateEnv <> \"-\" <> oldVersion updateEnv) & T.unpack,\n          \"-f\",\n          \".\",\n          \"--attr-path\"\n        ]\n          <> nixCommonOptions\n      )\n      & ourReadProcess_\n      & fmapRT (fst >>> T.lines >>> head >>> T.words >>> head)\n  )\n    <|>\n    -- if that fails, check by attrpath\n    (getAttrString \"name\" (packageName updateEnv))\n    & fmapRT (const (packageName updateEnv))\n\ngetDerivationFile :: MonadIO m => Text -> ExceptT Text m Text\ngetDerivationFile attrPath = do\n  npDir <- liftIO $ Git.nixpkgsDir\n  proc \"env\" [\"EDITOR=echo\", (binPath <> \"/nix\"), \"--extra-experimental-features\", \"nix-command\", \"edit\", attrPath & T.unpack, \"-f\", \".\"]\n    & ourReadProcess_\n    & fmapRT (fst >>> T.strip 
>>> T.stripPrefix (T.pack npDir <> \"/\") >>> fromJust)\n\n-- Get an attribute that can be evaluated off a derivation, as in:\n-- getAttr \"cargoHash\" \"ripgrep\" -> 0lwz661rbm7kwkd6mallxym1pz8ynda5f03ynjfd16vrazy2dj21\ngetAttr :: MonadIO m => Text -> Text -> ExceptT Text m Text\ngetAttr attr = srcOrMain (nixEvalApply (\"p: p.\" <> attr))\n\ngetAttrString :: MonadIO m => Text -> Text -> ExceptT Text m Text\ngetAttrString attr = srcOrMain (nixEvalApplyRaw (\"p: p.\" <> attr))\n\ngetHash :: MonadIO m => Text -> ExceptT Text m Text\ngetHash = getAttrString \"drvAttrs.outputHash\"\n\ngetMaintainers :: MonadIO m => Text -> ExceptT Text m Text\ngetMaintainers =\n  nixEvalApplyRaw \"p: let gh = m : m.github or \\\"\\\"; nonempty = s: s != \\\"\\\"; addAt = s: \\\"@\\\"+s; in builtins.concatStringsSep \\\" \\\" (map addAt (builtins.filter nonempty (map gh p.meta.maintainers or [])))\"\n\nreadNixBool :: MonadIO m => ExceptT Text m Text -> ExceptT Text m Bool\nreadNixBool t = do\n  text <- t\n  case text of\n    \"true\" -> return True\n    \"false\" -> return False\n    a -> throwE (\"Failed to read expected nix boolean \" <> a <> \" \")\n\ngetChangelog :: MonadIO m => Text -> ExceptT Text m Text\ngetChangelog = nixEvalApplyRaw \"p: p.meta.changelog or \\\"\\\"\"\n\ngetDescription :: MonadIO m => Text -> ExceptT Text m Text\ngetDescription = nixEvalApplyRaw \"p: p.meta.description or \\\"\\\"\"\n\ngetHomepage :: MonadIO m => Text -> ExceptT Text m Text\ngetHomepage = nixEvalApplyRaw \"p: p.meta.homepage or \\\"\\\"\"\n\ngetSrcUrl :: MonadIO m => Text -> ExceptT Text m Text\ngetSrcUrl =\n  srcOrMain\n    (nixEvalApplyRaw \"p: builtins.elemAt p.drvAttrs.urls 0\")\n\nbuildCmd :: Text -> ProcessConfig () () ()\nbuildCmd attrPath =\n  silently $ proc (binPath <> \"/nix-build\") (nixBuildOptions ++ [\"-A\", attrPath & T.unpack])\n\nlog :: Text -> ProcessConfig () () ()\nlog attrPath = proc (binPath <> \"/nix\") ([\"--extra-experimental-features\", \"nix-command\", \"log\", 
\"-f\", \".\", attrPath & T.unpack] <> nixCommonOptions)\n\nbuild :: MonadIO m => Text -> ExceptT Text m ()\nbuild attrPath =\n  (buildCmd attrPath & runProcess_ & tryIOTextET)\n    <|> ( do\n            _ <- buildFailedLog\n            throwE \"nix log failed trying to get build logs \"\n        )\n  where\n    buildFailedLog = do\n      buildLog <-\n        ourReadProcessInterleaved_ (log attrPath)\n          & fmap (T.lines >>> reverse >>> take 30 >>> reverse >>> T.unlines)\n      throwE (\"nix build failed.\\n\" <> buildLog <> \" \")\n\nnumberOfFetchers :: Text -> Int\nnumberOfFetchers derivationContents =\n  countUp \"fetchurl {\" + countUp \"fetchgit {\" + countUp \"fetchFromGitHub {\"\n  where\n    countUp x = T.count x derivationContents\n\n-- Sum the number of things that look like fixed-output derivation hashes\nnumberOfHashes :: Text -> Int\nnumberOfHashes derivationContents =\n  sum $ map countUp [\"sha256 =\", \"sha256=\", \"cargoHash =\", \"vendorHash =\", \"hash =\", \"npmDepsHash =\"]\n  where\n    countUp x = T.count x derivationContents\n\nassertOldVersionOn ::\n  MonadIO m => UpdateEnv -> Text -> Text -> ExceptT Text m ()\nassertOldVersionOn updateEnv branchName contents =\n  tryAssert\n    (\"Old version \" <> oldVersionPattern <> \" not present in \" <> branchName <> \" derivation file with contents: \" <> contents)\n    (oldVersionPattern `T.isInfixOf` contents)\n  where\n    oldVersionPattern = oldVersion updateEnv <> \"\\\"\"\n\nresultLink :: MonadIO m => ExceptT Text m Text\nresultLink =\n  T.strip\n    <$> ( ourReadProcessInterleaved_ \"readlink ./result\"\n            <|> ourReadProcessInterleaved_ \"readlink ./result-bin\"\n            <|> ourReadProcessInterleaved_ \"readlink ./result-dev\"\n            <|> ourReadProcessInterleaved_ \"readlink ./result-lib\"\n        )\n    <|> throwE \"Could not find result link. 
\"\n\nfakeHashMatching :: Text -> Text\nfakeHashMatching oldHash =\n  if \"sha512-\" `T.isPrefixOf` oldHash\n    then \"sha512-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA==\"\n    else \"sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=\"\n\n-- fixed-output derivation produced path '/nix/store/fg2hz90z5bc773gpsx4gfxn3l6fl66nw-source' with sha256 hash '0q1lsgc1621czrg49nmabq6am9sgxa9syxrwzlksqqr4dyzw4nmf' instead of the expected hash '0bp22mzkjy48gncj5vm9b7whzrggcbs5pd4cnb6k8jpl9j02dhdv'\ngetHashFromBuild :: MonadIO m => Text -> ExceptT Text m Text\ngetHashFromBuild =\n  srcOrMain\n    ( \\attrPath -> do\n        (exitCode, _, stderr) <- buildCmd attrPath & readProcess\n        when (exitCode == ExitSuccess) $ throwE \"build succeeded unexpectedly\"\n        let stdErrText = bytestringToText stderr\n        let firstSplit = T.splitOn \"got:    \" stdErrText\n        firstSplitSecondPart <-\n          tryAt\n            (\"stderr did not split as expected full stderr was: \\n\" <> stdErrText)\n            firstSplit\n            1\n        let secondSplit = T.splitOn \"\\n\" firstSplitSecondPart\n        tryHead\n          ( \"stderr did not split second part as expected full stderr was: \\n\"\n              <> stdErrText\n              <> \"\\nfirstSplitSecondPart:\\n\"\n              <> firstSplitSecondPart\n          )\n          secondSplit\n    )\n\nversion :: MonadIO m => ExceptT Text m Text\nversion = ourReadProcessInterleaved_ (proc (binPath <> \"/nix\") [\"--version\"])\n\ngetPatches :: MonadIO m => Text -> ExceptT Text m Text\ngetPatches =\n  nixEvalApply \"p: map (patch: patch.name) p.patches\"\n\nhasPatchNamed :: MonadIO m => Text -> Text -> ExceptT Text m Bool\nhasPatchNamed attrPath name = do\n  ps <- getPatches attrPath\n  return $ name `T.isInfixOf` ps\n\nhasUpdateScript :: MonadIO m => Text -> ExceptT Text m Bool\nhasUpdateScript attrPath = do\n  nixEvalApply\n    \"p: builtins.hasAttr \\\"updateScript\\\" 
p\"\n    attrPath\n    & readNixBool\n\nrunUpdateScript :: MonadIO m => Text -> ExceptT Text m (ExitCode, Text)\nrunUpdateScript attrPath = do\n  let timeout = \"30m\" :: Text\n  (exitCode, output) <-\n    ourReadProcessInterleaved $\n      TP.setStdin (TP.byteStringInput \"\\n\") $\n        proc \"timeout\" [T.unpack timeout, \"env\", \"NIXPKGS_ALLOW_UNFREE=1\", \"nix-shell\", \"maintainers/scripts/update.nix\", \"--argstr\", \"package\", T.unpack attrPath]\n  case exitCode of\n    ExitFailure 124 -> do\n      return (exitCode, \"updateScript for \" <> attrPath <> \" took longer than \" <> timeout <> \" and timed out. Other output: \" <> output)\n    _ -> do\n      return (exitCode, output)\n"
  },
  {
    "path": "src/NixpkgsReview.hs",
    "content": "{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE TemplateHaskell #-}\n\nmodule NixpkgsReview\n  ( cacheDir,\n    runReport,\n  )\nwhere\n\nimport Data.Maybe (fromJust)\nimport Data.Text as T\nimport qualified File as F\nimport Language.Haskell.TH.Env (envQ)\nimport OurPrelude\nimport Polysemy.Output (Output, output)\nimport qualified Process as P\nimport System.Directory (doesFileExist)\nimport System.Environment.XDG.BaseDir (getUserCacheDir)\nimport System.Exit ()\nimport qualified Utils\nimport Prelude hiding (log)\n\nbinPath :: String\nbinPath = fromJust ($$(envQ \"NIXPKGSREVIEW\") :: Maybe String) <> \"/bin\"\n\ncacheDir :: IO FilePath\ncacheDir = getUserCacheDir \"nixpkgs-review\"\n\nrevDir :: FilePath -> Text -> FilePath\nrevDir cache commit = cache <> \"/rev-\" <> T.unpack commit\n\nrun ::\n  Members '[F.File, P.Process, Output Text, Embed IO] r =>\n  FilePath ->\n  Text ->\n  Sem r Text\nrun cache commit =\n  let timeout = \"180m\" :: Text\n   in do\n        -- TODO: probably just skip running nixpkgs-review if the directory\n        -- already exists\n        void $\n          ourReadProcessInterleavedSem $\n            proc \"rm\" [\"-rf\", revDir cache commit]\n        (exitCode, _nixpkgsReviewOutput) <-\n          ourReadProcessInterleavedSem $\n            proc \"timeout\" [T.unpack timeout, (binPath <> \"/nixpkgs-review\"), \"rev\", T.unpack commit, \"--no-shell\"]\n        case exitCode of\n          ExitFailure 124 -> do\n            output $ \"[check][nixpkgs-review] took longer than \" <> timeout <> \" and timed out\"\n            return $ \":warning: nixpkgs-review took longer than \" <> timeout <> \" and timed out\"\n          _ -> do\n            reportExists <- embed $ doesFileExist (revDir cache commit <> \"/report.md\")\n            if reportExists\n              then F.read $ (revDir cache commit) <> \"/report.md\"\n              else do\n                output $ \"[check][nixpkgs-review] report.md does not exist\"\n         
       return $ \":x: nixpkgs-review failed\"\n\n-- Assumes we are already in nixpkgs dir\nrunReport :: (Text -> IO ()) -> Text -> IO Text\nrunReport log commit = do\n  log \"[check][nixpkgs-review]\"\n  c <- cacheDir\n  msg <-\n    runFinal\n      . embedToFinal\n      . F.runIO\n      . P.runIO\n      . Utils.runLog log\n      $ NixpkgsReview.run c commit\n  log msg\n  return msg\n"
  },
  {
    "path": "src/OurPrelude.hs",
    "content": "{-# LANGUAGE PartialTypeSignatures #-}\n\nmodule OurPrelude\n  ( (>>>),\n    (<|>),\n    (<>),\n    (</>),\n    (<&>),\n    (&),\n    module Control.Error,\n    module Control.Monad.Except,\n    module Control.Monad.Trans.Class,\n    module Control.Monad.IO.Class,\n    module Data.Bifunctor,\n    module System.Process.Typed,\n    module Polysemy,\n    module Polysemy.Error,\n    ignoreExitCodeException,\n    Set,\n    Text,\n    Vector,\n    interpolate,\n    tshow,\n    tryIOTextET,\n    whenM,\n    ourReadProcess_,\n    ourReadProcess_Sem,\n    ourReadProcessInterleaved_,\n    ourReadProcessInterleavedBS_,\n    ourReadProcessInterleaved,\n    ourReadProcessInterleavedSem,\n    silently,\n    bytestringToText,\n  )\nwhere\n\nimport Control.Applicative ((<|>))\nimport Control.Category ((>>>))\nimport Control.Error\nimport qualified Control.Exception\nimport Control.Monad.Except\nimport Control.Monad.IO.Class\nimport Control.Monad.Trans.Class\nimport Data.Bifunctor\nimport qualified Data.ByteString.Lazy as BSL\nimport Data.Function ((&))\nimport Data.Functor ((<&>))\nimport Data.Set (Set)\nimport Data.Text (Text, pack)\nimport qualified Data.Text.Encoding as T\nimport qualified Data.Text.Encoding.Error as T\nimport Data.Vector (Vector)\nimport Language.Haskell.TH.Quote\nimport qualified NeatInterpolation\nimport Polysemy\nimport Polysemy.Error hiding (note, try, tryJust)\nimport qualified Process as P\nimport System.Exit\nimport System.FilePath ((</>))\nimport System.Process.Typed\n\ninterpolate :: QuasiQuoter\ninterpolate = NeatInterpolation.text\n\ntshow :: Show a => a -> Text\ntshow = show >>> pack\n\ntryIOTextET :: MonadIO m => IO a -> ExceptT Text m a\ntryIOTextET = syncIO >>> fmapLT tshow\n\nwhenM :: Monad m => m Bool -> m () -> m ()\nwhenM c a = c >>= \\res -> when res a\n\nbytestringToText :: BSL.ByteString -> Text\nbytestringToText = BSL.toStrict >>> (T.decodeUtf8With T.lenientDecode)\n\nourReadProcessInterleavedBS_ ::\n  MonadIO m =>\n  
ProcessConfig stdin stdoutIgnored stderrIgnored ->\n  ExceptT Text m BSL.ByteString\nourReadProcessInterleavedBS_ = readProcessInterleaved_ >>> tryIOTextET\n\nourReadProcess_ ::\n  MonadIO m =>\n  ProcessConfig stdin stdout stderr ->\n  ExceptT Text m (Text, Text)\nourReadProcess_ = readProcess_ >>> tryIOTextET >>> fmapRT (\\(stdout, stderr) -> (bytestringToText stdout, bytestringToText stderr))\n\nourReadProcess_Sem ::\n  Members '[P.Process] r =>\n  ProcessConfig stdin stdoutIgnored stderrIgnored ->\n  Sem r (Text, Text)\nourReadProcess_Sem =\n  P.read_ >>> fmap (\\(stdout, stderr) -> (bytestringToText stdout, bytestringToText stderr))\n\nourReadProcessInterleaved_ ::\n  MonadIO m =>\n  ProcessConfig stdin stdoutIgnored stderrIgnored ->\n  ExceptT Text m Text\nourReadProcessInterleaved_ =\n  readProcessInterleaved_ >>> tryIOTextET >>> fmapRT bytestringToText\n\nourReadProcessInterleaved ::\n  MonadIO m =>\n  ProcessConfig stdin stdoutIgnored stderrIgnored ->\n  ExceptT Text m (ExitCode, Text)\nourReadProcessInterleaved =\n  readProcessInterleaved\n    >>> tryIOTextET\n    >>> fmapRT (\\(a, b) -> (a, bytestringToText b))\n\nourReadProcessInterleavedSem ::\n  Members '[P.Process] r =>\n  ProcessConfig stdin stdoutIgnored stderrIgnored ->\n  Sem r (ExitCode, Text)\nourReadProcessInterleavedSem =\n  P.readInterleaved\n    >>> fmap (\\(a, b) -> (a, bytestringToText b))\n\nsilently :: ProcessConfig stdin stdout stderr -> ProcessConfig () () ()\nsilently = setStderr closed >>> setStdin closed >>> setStdout closed\n\nignoreExitCodeException :: IO () -> IO ()\nignoreExitCodeException a = Control.Exception.catch a (\\(_e :: ExitCodeException) -> pure ())\n"
  },
  {
    "path": "src/Outpaths.hs",
    "content": "{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE QuasiQuotes #-}\n\nmodule Outpaths\n  ( currentOutpathSet,\n    currentOutpathSetUncached,\n    ResultLine,\n    dummyOutpathSetBefore,\n    dummyOutpathSetAfter,\n    packageRebuilds,\n    numPackageRebuilds,\n    outpathReport,\n  )\nwhere\n\nimport Data.List (sort)\nimport qualified Data.Set as S\nimport qualified Data.Text as T\nimport qualified Data.Text.IO as T\nimport qualified Data.Vector as V\nimport qualified Git\nimport OurPrelude\nimport qualified System.Directory\nimport qualified System.Posix.Files as F\nimport Text.Parsec (parse)\nimport Text.Parser.Char\nimport Text.Parser.Combinators\nimport qualified Utils\n\noutPathsExpr :: Text\noutPathsExpr =\n  [interpolate|\n{ checkMeta\n, path ? ./.\n}:\nlet\n  lib = import (path + \"/lib\");\n  hydraJobs = import (path + \"/pkgs/top-level/release.nix\")\n    # Compromise: accuracy vs. resources needed for evaluation.\n    # we only evaluate one architecture per OS as we most likely catch all\n    # mass-rebuilds this way.\n    {\n      supportedSystems = [\n        \"x86_64-linux\"\n     ];\n\n      nixpkgsArgs = {\n        config = {\n          allowUnfree = true;\n          allowInsecurePredicate = x: true;\n          checkMeta = checkMeta;\n\n          handleEvalIssue = reason: errormsg:\n            let\n              fatalErrors = [\n                \"unknown-meta\" \"broken-outputs\"\n              ];\n            in if builtins.elem reason fatalErrors\n              then abort errormsg\n              else true;\n\n          inHydra = true;\n        };\n      };\n    };\n  nixosJobs = import (path + \"/nixos/release.nix\") {\n    supportedSystems = [ \"x86_64-linux\" ];\n  };\n  recurseIntoAttrs = attrs: attrs // { recurseForDerivations = true; };\n\n  # hydraJobs leaves recurseForDerivations as empty attrmaps;\n  # that would break nix-env and we also need to recurse everywhere.\n  tweak = lib.mapAttrs\n    (name: val:\n      if name == 
\"recurseForDerivations\" then true\n      else if lib.isAttrs val && val.type or null != \"derivation\"\n              then recurseIntoAttrs (tweak val)\n      else val\n    );\n\n  # Some of these contain explicit references to platform(s) we want to avoid;\n  # some even (transitively) depend on ~/.nixpkgs/config.nix (!)\n  blacklist = [\n    \"tarball\" \"metrics\" \"manual\"\n    \"darwin-tested\" \"unstable\" \"stdenvBootstrapTools\"\n    \"moduleSystem\" \"lib-tests\" # these just confuse the output\n  ];\n\nin\n  tweak (\n    (builtins.removeAttrs hydraJobs blacklist)\n    // {\n      nixosTests.simple = nixosJobs.tests.simple;\n    }\n  )\n|]\n\noutPath :: MonadIO m => ExceptT Text m Text\noutPath = do\n  cacheDir <- liftIO $ Utils.outpathCacheDir\n  let outpathFile = (cacheDir </> \"outpaths.nix\")\n  liftIO $ T.writeFile outpathFile outPathsExpr\n  liftIO $ putStrLn \"[outpaths] eval start\"\n  currentDir <- liftIO $ System.Directory.getCurrentDirectory\n  result <-\n    ourReadProcessInterleaved_ $\n      proc\n        \"nix-env\"\n        [ \"-f\",\n          outpathFile,\n          \"-qaP\",\n          \"--no-name\",\n          \"--out-path\",\n          \"--arg\",\n          \"path\",\n          currentDir,\n          \"--arg\",\n          \"checkMeta\",\n          \"true\",\n          \"--show-trace\"\n        ]\n  liftIO $ putStrLn \"[outpaths] eval end\"\n  pure result\n\ndata Outpath = Outpath\n  { mayName :: Maybe Text,\n    storePath :: Text\n  }\n  deriving (Eq, Ord, Show)\n\ndata ResultLine = ResultLine\n  { package :: Text,\n    architecture :: Text,\n    outpaths :: Vector Outpath\n  }\n  deriving (Eq, Ord, Show)\n\n-- Example query result line:\n-- testInput :: Text\n-- testInput =\n--   \"haskellPackages.amazonka-dynamodb-streams.x86_64-linux                        
doc=/nix/store/m4rpsc9nx0qcflh9ni6qdlg6hbkwpicc-amazonka-dynamodb-streams-1.6.0-doc;/nix/store/rvd4zydr22a7j5kgnmg5x6695c7bgqbk-amazonka-dynamodb-streams-1.6.0\\nhaskellPackages.agum.x86_64-darwin                                            doc=/nix/store/n526rc0pa5h0krdzsdni5agcpvcd3cb9-agum-2.7-doc;/nix/store/s59r75svbjm724q5iaprq4mln5k6wcr9-agum-2.7\"\ncurrentOutpathSet :: MonadIO m => ExceptT Text m (Set ResultLine)\ncurrentOutpathSet = do\n  rev <- Git.headRev\n  mayOp <- lift $ lookupOutPathByRev rev\n  op <- case mayOp of\n    Just paths -> pure paths\n    Nothing -> do\n      paths <- outPath\n      dir <- Utils.outpathCacheDir\n      let file = dir <> \"/\" <> T.unpack rev\n      liftIO $ T.writeFile file paths\n      pure paths\n  parse parseResults \"outpath\" op & fmapL tshow & hoistEither\n\ncurrentOutpathSetUncached :: MonadIO m => ExceptT Text m (Set ResultLine)\ncurrentOutpathSetUncached = do\n  op <- outPath\n  parse parseResults \"outpath\" op & fmapL tshow & hoistEither\n\nlookupOutPathByRev :: MonadIO m => Text -> m (Maybe Text)\nlookupOutPathByRev rev = do\n  dir <- Utils.outpathCacheDir\n  let file = dir <> \"/\" <> T.unpack rev\n  fileExists <- liftIO $ F.fileExist file\n  case fileExists of\n    False -> return Nothing\n    True -> do\n      paths <- liftIO $ readFile file\n      return $ Just $ T.pack paths\n\ndummyOutpathSetBefore :: Text -> Set ResultLine\ndummyOutpathSetBefore attrPath = S.singleton (ResultLine attrPath \"x86-64\" (V.singleton (Outpath (Just \"attrPath\") \"fakepath\")))\n\ndummyOutpathSetAfter :: Text -> Set ResultLine\ndummyOutpathSetAfter attrPath = S.singleton (ResultLine attrPath \"x86-64\" (V.singleton (Outpath (Just \"attrPath\") \"fakepath-edited\")))\n\nparseResults :: CharParsing m => m (Set ResultLine)\nparseResults = S.fromList <$> parseResultLine `sepEndBy` newline\n\nparseResultLine :: CharParsing m => m ResultLine\nparseResultLine =\n  ResultLine\n    <$> (T.dropWhileEnd (== '.') <$> parseAttrpath)\n    <*> 
parseArchitecture\n    <* spaces\n    <*> parseOutpaths\n\nparseAttrpath :: CharParsing m => m Text\nparseAttrpath = T.concat <$> many (try parseAttrpathPart)\n\nparseAttrpathPart :: CharParsing m => m Text\nparseAttrpathPart = T.snoc <$> (T.pack <$> many (noneOf \". \")) <*> char '.'\n\nparseArchitecture :: CharParsing m => m Text\nparseArchitecture = T.pack <$> many (noneOf \" \")\n\nparseOutpaths :: CharParsing m => m (Vector Outpath)\nparseOutpaths = V.fromList <$> (parseOutpath `sepBy1` char ';')\n\nparseOutpath :: CharParsing m => m Outpath\nparseOutpath =\n  Outpath\n    <$> optional (try (T.pack <$> (many (noneOf \"=\\n\") <* char '=')))\n    <*> (T.pack <$> many (noneOf \";\\n\"))\n\npackageRebuilds :: Set ResultLine -> Vector Text\npackageRebuilds = S.toList >>> fmap package >>> sort >>> V.fromList >>> V.uniq\n\nnumPackageRebuilds :: Set ResultLine -> Int\nnumPackageRebuilds diff = V.length $ packageRebuilds diff\n\noutpathReport :: Set ResultLine -> Text\noutpathReport diff =\n  let pkg = tshow $ V.length $ packageRebuilds diff\n      firstFifty = T.unlines $ V.toList $ V.take 50 $ packageRebuilds diff\n      numPaths = tshow $ S.size diff\n   in [interpolate|\n        $numPaths total rebuild path(s)\n\n        $pkg package rebuild(s)\n\n        First fifty rebuilds by attrpath\n        $firstFifty\n      |]\n"
  },
  {
    "path": "src/Process.hs",
    "content": "{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE TemplateHaskell #-}\n\nmodule Process where\n\nimport qualified Data.ByteString.Lazy as BSL\nimport Polysemy\nimport Polysemy.Input\nimport System.Exit (ExitCode (..))\nimport qualified System.Process.Typed as TP\n\ndata Process m a where\n  Read_ :: TP.ProcessConfig stdin stdout stderr -> Process m (BSL.ByteString, BSL.ByteString)\n  ReadInterleaved_ :: TP.ProcessConfig stdin stdout stderr -> Process m BSL.ByteString\n  ReadInterleaved :: TP.ProcessConfig stdin stdout stderr -> Process m (ExitCode, BSL.ByteString)\n\nmakeSem ''Process\n\nrunIO ::\n  Member (Embed IO) r =>\n  Sem (Process ': r) a ->\n  Sem r a\nrunIO =\n  interpret $ \\case\n    Read_ config -> embed $ (TP.readProcess_ config)\n    ReadInterleaved_ config -> embed $ (TP.readProcessInterleaved_ config)\n    ReadInterleaved config -> embed $ (TP.readProcessInterleaved config)\n\nrunPure ::\n  [BSL.ByteString] ->\n  Sem (Process ': r) a ->\n  Sem r a\nrunPure outputList =\n  runInputList outputList\n    . reinterpret \\case\n      Read_ _config -> do\n        r <- maybe \"\" id <$> input\n        return (r, \"\")\n      ReadInterleaved_ _config -> maybe \"\" id <$> input\n      ReadInterleaved _config -> do\n        r <- maybe \"\" id <$> input\n        return (ExitSuccess, r)\n"
  },
  {
    "path": "src/Repology.hs",
    "content": "{-# LANGUAGE DeriveAnyClass #-}\n{-# LANGUAGE DeriveGeneric #-}\n{-# LANGUAGE OverloadedStrings #-}\n\nmodule Repology where\n\nimport Control.Applicative (liftA2)\nimport Control.Concurrent (threadDelay)\nimport Data.Aeson\nimport Data.HashMap.Strict\nimport Data.List\nimport Data.Proxy\nimport qualified Data.Text as T\nimport qualified Data.Text.IO\nimport qualified Data.Vector as V\nimport GHC.Generics\nimport Network.HTTP.Client (managerModifyRequest, newManager, requestHeaders)\nimport Network.HTTP.Client.TLS (tlsManagerSettings)\nimport OurPrelude\nimport Servant.API\nimport Servant.Client (BaseUrl (..), ClientM, Scheme (..), client, mkClientEnv, runClientM)\nimport System.IO\n\nbaseUrl :: BaseUrl\nbaseUrl = BaseUrl Https \"repology.org\" 443 \"/api/v1\"\n\nrateLimit :: IO ()\nrateLimit = threadDelay 2000000\n\ntype Project = Vector Package\n\n-- compareProject :: Project -> Project -> Ordering\n-- compareProject ps1 ps2 = compareProject' (ps1 V.!? 0) (ps2 V.!? 0)\n--   where\n--     compareProject' (Just p1) (Just p2) = compare (name p1) (name p2)\n--     compareProject' Nothing (Just _) = LT\n--     compareProject' (Just _) Nothing = GT\n--     compareProject' _ _ = EQ\n\ntype Projects = HashMap Text Project\n\ntype API =\n  \"project\" :> Capture \"project_name\" Text :> Get '[JSON] Project\n    :<|> \"projects\" :> QueryParam \"inrepo\" Text :> QueryParam \"outdated\" Bool :> Get '[JSON] Projects\n    :<|> \"projects\" :> Capture \"name\" Text :> QueryParam \"inrepo\" Text :> QueryParam \"outdated\" Bool :> Get '[JSON] Projects\n\ndata Package = Package\n  { repo :: Text,\n    srcname :: Maybe Text, -- corresponds to attribute path\n    visiblename :: Text, -- corresponds to pname\n    version :: Text,\n    origversion :: Maybe Text,\n    status :: Maybe Text,\n    summary :: Maybe Text,\n    categories :: Maybe (Vector Text),\n    licenses :: Maybe (Vector Text)\n  }\n  deriving (Eq, Show, Generic, FromJSON)\n\napi :: Proxy API\napi = 
Proxy\n\nproject :: Text -> ClientM (Vector Package)\nprojects ::\n  Maybe Text ->\n  Maybe Bool ->\n  ClientM Projects\nprojects' ::\n  Text ->\n  Maybe Text ->\n  Maybe Bool ->\n  ClientM Projects\nproject :<|> projects :<|> projects' = client api\n\n-- type PagingResult = PagingResult (Vector Project, ClientM PagingResult)\n-- projects :: Text -> ClientM PagingResult\n-- projects n = do\n--   m <- ms n\n--   return (lastProjectName m, sortedProjects m)\nlastProjectName :: Projects -> Maybe Text\nlastProjectName = keys >>> sort >>> Prelude.reverse >>> headMay\n\n-- sortedProjects :: Projects -> Vector Project\n-- sortedProjects = elems >>> sortBy compareProject >>> V.fromList\n\nnixRepo :: Text\nnixRepo = \"nix_unstable\"\n\nnixOutdated :: ClientM Projects\nnixOutdated =\n  projects\n    (Just nixRepo)\n    (Just True)\n\nnextNixOutdated :: Text -> ClientM Projects\nnextNixOutdated n =\n  projects'\n    n\n    (Just nixRepo)\n    (Just True)\n\noutdatedForRepo :: Text -> Vector Package -> Maybe Package\noutdatedForRepo r =\n  V.find (\\p -> (status p) == Just \"outdated\" && (repo p) == r)\n\nnewest :: Vector Package -> Maybe Package\nnewest = V.find (\\p -> (status p) == Just \"newest\")\n\ngetUpdateInfo :: ClientM (Maybe Text, Bool, Vector (Text, (Package, Package)))\ngetUpdateInfo = do\n  liftIO rateLimit\n  outdated <- nixOutdated\n  let nixNew = toList $ Data.HashMap.Strict.mapMaybe (liftA2 (liftA2 (,)) (outdatedForRepo nixRepo) newest) outdated\n  let mLastName = lastProjectName outdated\n  liftIO $ hPutStrLn stderr $ show mLastName\n  liftIO $ hPutStrLn stderr $ show (size outdated)\n  return (mLastName, size outdated /= 1, V.fromList nixNew)\n\n--  let sorted = sortBy (\\(p1,_) (p2,_) -> compare (name p1) (name p2)) nixNew\ngetNextUpdateInfo ::\n  Text -> ClientM (Maybe Text, Bool, Vector (Text, (Package, Package)))\ngetNextUpdateInfo n = do\n  liftIO rateLimit\n  outdated <- nextNixOutdated n\n  let nixNew = toList $ Data.HashMap.Strict.mapMaybe (liftA2 
(liftA2 (,)) (outdatedForRepo nixRepo) newest) outdated\n  let mLastName = lastProjectName outdated\n  liftIO $ hPutStrLn stderr $ show mLastName\n  liftIO $ hPutStrLn stderr $ show (size outdated)\n  return (mLastName, size outdated /= 1, V.fromList nixNew)\n\n-- Argument should be the Repology identifier of the project, not srcname/attrPath or visiblename/pname.\nrepologyUrl :: Text -> Text\nrepologyUrl projectName = \"https://repology.org/project/\" <> projectName <> \"/versions\"\n\n--  let sorted = sortBy (\\(p1,_) (p2,_) -> compare (name p1) (name p2)) nixNew\nupdateInfo :: (Text, (Package, Package)) -> Maybe Text\nupdateInfo (projectName, (outdated, newestP)) = do\n  attrPath <- srcname outdated\n  pure $ T.unwords [attrPath, version outdated, version newestP, repologyUrl projectName]\n\njusts :: Vector (Maybe a) -> Vector a\njusts = V.concatMap (maybeToList >>> V.fromList)\n\nmoreNixUpdateInfo ::\n  (Maybe Text, Vector (Package, Package)) ->\n  ClientM (Vector (Package, Package))\nmoreNixUpdateInfo (Nothing, acc) = do\n  (mLastName, moreWork, newNix) <- getUpdateInfo\n  liftIO $\n    V.sequence_ $\n      fmap Data.Text.IO.putStrLn $\n        justs $\n          fmap updateInfo newNix\n  if moreWork\n    then moreNixUpdateInfo (mLastName, fmap snd newNix V.++ acc)\n    else return acc\nmoreNixUpdateInfo (Just n, acc) = do\n  (mLastName, moreWork, newNix) <- getNextUpdateInfo n\n  liftIO $\n    V.sequence_ $\n      fmap Data.Text.IO.putStrLn $\n        justs $\n          fmap updateInfo newNix\n  if moreWork\n    then moreNixUpdateInfo (mLastName, fmap snd newNix V.++ acc)\n    else return acc\n\nallNixUpdateInfo :: ClientM (Vector (Package, Package))\nallNixUpdateInfo = moreNixUpdateInfo (Nothing, V.empty)\n\nfetch :: IO ()\nfetch = do\n  hSetBuffering stdout LineBuffering\n  hSetBuffering stderr LineBuffering\n  liftIO $ hPutStrLn stderr \"starting\"\n  let addUserAgent req = pure $ req {requestHeaders = (\"User-Agent\", 
\"https://github.com/nix-community/nixpkgs-update\") : requestHeaders req}\n  manager' <- newManager tlsManagerSettings {managerModifyRequest = addUserAgent}\n  e <- runClientM allNixUpdateInfo (mkClientEnv manager' baseUrl)\n  case e of\n    Left ce -> liftIO $ hPutStrLn stderr $ show ce\n    Right _ -> liftIO $ hPutStrLn stderr $ \"done\"\n  return ()\n"
  },
  {
    "path": "src/Rewrite.hs",
    "content": "{-# LANGUAGE MultiWayIf #-}\n{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE RecordWildCards #-}\n{-# LANGUAGE ViewPatterns #-}\n\nmodule Rewrite\n  ( Args (..),\n    runAll,\n    golangModuleVersion,\n    rustCrateVersion,\n    version,\n    redirectedUrls,\n  )\nwhere\n\nimport qualified Data.Text as T\nimport Data.Text.Encoding as T\nimport Data.Text.Encoding.Error as T\nimport Data.Text.IO as T\nimport qualified File\nimport qualified Network.HTTP.Client as HTTP\nimport Network.HTTP.Types.Status (statusCode)\nimport qualified Nix\nimport OurPrelude\nimport System.Exit ()\nimport Utils (UpdateEnv (..))\nimport Prelude hiding (log)\n\n{-\n This module contains rewrite functions that make some modification to the\n nix derivation. These are in the IO monad so that they can do things like\n re-run nix-build to recompute hashes, but morally they should just stick to\n editing the derivationFile for their one stated purpose.\n\n The return contract is:\n - If it makes a modification, it should return a simple message to attach to\n   the pull request description to provide context or justification for code\n   reviewers (e.g., a GitHub issue or RFC).\n - If it makes no modification, return None\n - If it throws an exception, nixpkgs-update will be aborted for the package and\n   no other rewrite functions will run.\n\n  TODO: Setup some unit tests for these!\n-}\ndata Args = Args\n  { updateEnv :: Utils.UpdateEnv,\n    attrPath :: Text,\n    derivationFile :: FilePath,\n    derivationContents :: Text,\n    hasUpdateScript :: Bool\n  }\n\ntype Rewriter = (Text -> IO ()) -> Args -> ExceptT Text IO (Maybe Text)\n\ntype Plan = [(Text, Rewriter)]\n\nplan :: Plan\nplan =\n  [ (\"version\", version),\n    (\"rustCrateVersion\", rustCrateVersion),\n    (\"golangModuleVersion\", golangModuleVersion),\n    (\"npmDepsVersion\", npmDepsVersion),\n    (\"updateScript\", updateScript)\n    -- (\"redirectedUrl\", Rewrite.redirectedUrls)\n  ]\n\nrunAll :: (Text -> 
IO ()) -> Args -> ExceptT Text IO [Text]\nrunAll log args = do\n  msgs <- forM plan $ \\(name, f) -> do\n    let log' msg =\n          if T.null name\n            then log msg\n            else log $ (\"[\" <> name <> \"] \") <> msg\n    lift $ log' \"\" -- Print initial empty message to signal start of rewriter\n    f log' args\n  return $ catMaybes msgs\n\n--------------------------------------------------------------------------------\n-- The canonical updater: updates the src attribute and recomputes the sha256\nversion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)\nversion log args@Args {..} = do\n  if\n      | Nix.numberOfFetchers derivationContents > 1 || Nix.numberOfHashes derivationContents > 1 -> do\n          lift $ log \"generic version rewriter does not support multiple hashes\"\n          return Nothing\n      | hasUpdateScript -> do\n          lift $ log \"skipping because derivation has updateScript\"\n          return Nothing\n      | otherwise -> do\n          srcVersionFix args\n          lift $ log \"updated version and sha256\"\n          return $ Just \"Version update\"\n\n--------------------------------------------------------------------------------\n-- Redirect homepage when moved.\nredirectedUrls :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)\nredirectedUrls log Args {..} = do\n  homepage <- Nix.getHomepage attrPath\n  response <- liftIO $ do\n    manager <- HTTP.newManager HTTP.defaultManagerSettings\n    request <- HTTP.parseRequest (T.unpack homepage)\n    HTTP.httpLbs request manager\n  let status = statusCode $ HTTP.responseStatus response\n  if status `elem` [301, 308]\n    then do\n      lift $ log \"Redirecting URL\"\n      let headers = HTTP.responseHeaders response\n          location = lookup \"Location\" headers\n      case location of\n        Nothing -> do\n          lift $ log \"Server did not return a location\"\n          return Nothing\n        Just ((T.decodeUtf8With 
T.lenientDecode) -> newHomepage) -> do\n          _ <- File.replaceIO homepage newHomepage derivationFile\n          lift $ log \"Replaced homepage\"\n          return $\n            Just $\n              \"Replaced homepage by \"\n                <> newHomepage\n                <> \" due to HTTP \"\n                <> (T.pack . show) status\n    else do\n      lift $ log \"URL not redirected\"\n      return Nothing\n\n--------------------------------------------------------------------------------\n-- Rewrite Rust packages using rustPlatform.buildRustPackage\n-- This is basically `version` above, but with a second pass to also update\n-- cargoHash (the vendored dependencies hash).\nrustCrateVersion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)\nrustCrateVersion log args@Args {..} = do\n  if\n      | (not (T.isInfixOf \"cargoHash\" derivationContents)) -> do\n          lift $ log \"No cargoHash found\"\n          return Nothing\n      | hasUpdateScript -> do\n          lift $ log \"skipping because derivation has updateScript\"\n          return Nothing\n      | otherwise -> do\n          -- This starts the same way `version` does, minus the assert\n          srcVersionFix args\n          -- But then from there we need to do this a second time for the cargoHash!\n          oldCargoHash <- Nix.getAttrString \"cargoHash\" attrPath\n          let fakeHash = Nix.fakeHashMatching oldCargoHash\n          _ <- lift $ File.replaceIO oldCargoHash fakeHash derivationFile\n          newCargoHash <- Nix.getHashFromBuild attrPath\n          when (oldCargoHash == newCargoHash) $ throwE (\"cargo hashes equal; no update necessary: \" <> oldCargoHash)\n          lift . 
log $ \"Replacing cargoHash with \" <> newCargoHash\n          _ <- lift $ File.replaceIO fakeHash newCargoHash derivationFile\n          -- Ensure the package actually builds and passes its tests\n          Nix.build attrPath\n          lift $ log \"Finished updating Crate version and replacing hashes\"\n          return $ Just \"Rust version update\"\n\n--------------------------------------------------------------------------------\n-- Rewrite Golang packages with buildGoModule\n-- This is basically `version` above, but with a second pass to also update the\n-- vendorHash go vendor hash.\ngolangModuleVersion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)\ngolangModuleVersion log args@Args {..} = do\n  if\n      | not (T.isInfixOf \"buildGo\" derivationContents && T.isInfixOf \"vendorHash\" derivationContents) -> do\n          lift $ log \"Not a buildGoModule package with vendorHash\"\n          return Nothing\n      | hasUpdateScript -> do\n          lift $ log \"skipping because derivation has updateScript\"\n          return Nothing\n      | otherwise -> do\n          -- This starts the same way `version` does, minus the assert\n          srcVersionFix args\n          -- But then from there we need to do this a second time for the vendorHash!\n          -- Note that explicit `null` cannot be coerced to a string by nix eval --raw\n          oldVendorHash <- Nix.getAttr \"vendorHash\" attrPath\n          lift . 
log $ \"Found old vendorHash = \" <> oldVendorHash\n          original <- liftIO $ T.readFile derivationFile\n          _ <- lift $ File.replaceIO oldVendorHash \"null\" derivationFile\n          ok <- runExceptT $ Nix.build attrPath\n          _ <-\n            if isLeft ok\n              then do\n                _ <- liftIO $ T.writeFile derivationFile original\n                let fakeHash = Nix.fakeHashMatching oldVendorHash\n                _ <- lift $ File.replaceIO oldVendorHash (\"\\\"\" <> fakeHash <> \"\\\"\") derivationFile\n                newVendorHash <- Nix.getHashFromBuild attrPath\n                _ <- lift $ File.replaceIO fakeHash newVendorHash derivationFile\n                -- Note that on some small bumps, this may not actually change if go.sum did not\n                lift . log $ \"Replaced vendorHash with \" <> newVendorHash\n              else do\n                lift . log $ \"Set vendorHash to null\"\n          -- Ensure the package actually builds and passes its tests\n          Nix.build attrPath\n          lift $ log \"Finished updating vendorHash\"\n          return $ Just \"Golang update\"\n\n--------------------------------------------------------------------------------\n-- Rewrite NPM packages with buildNpmPackage\n-- This is basically `version` above, but with a second pass to also update the\n-- npmDepsHash vendor hash.\nnpmDepsVersion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)\nnpmDepsVersion log args@Args {..} = do\n  if\n      | not (T.isInfixOf \"npmDepsHash\" derivationContents) -> do\n          lift $ log \"No npmDepsHash\"\n          return Nothing\n      | hasUpdateScript -> do\n          lift $ log \"skipping because derivation has updateScript\"\n          return Nothing\n      | otherwise -> do\n          -- This starts the same way `version` does, minus the assert\n          srcVersionFix args\n          -- But then from there we need to do this a second time for the cargoHash!\n          
oldDepsHash <- Nix.getAttrString \"npmDepsHash\" attrPath\n          let fakeHash = Nix.fakeHashMatching oldDepsHash\n          _ <- lift $ File.replaceIO oldDepsHash fakeHash derivationFile\n          newDepsHash <- Nix.getHashFromBuild attrPath\n          when (oldDepsHash == newDepsHash) $ throwE (\"deps hashes equal; no update necessary: \" <> oldDepsHash)\n          lift . log $ \"Replacing npmDepsHash with \" <> newDepsHash\n          _ <- lift $ File.replaceIO fakeHash newDepsHash derivationFile\n          -- Ensure the package actually builds and passes its tests\n          Nix.build attrPath\n          lift $ log \"Finished updating NPM deps version and replacing hashes\"\n          return $ Just \"NPM version update\"\n\n--------------------------------------------------------------------------------\n\n-- Calls passthru.updateScript\nupdateScript :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)\nupdateScript log Args {..} = do\n  if hasUpdateScript\n    then do\n      (exitCode, msg) <- Nix.runUpdateScript attrPath\n      case exitCode of\n        ExitSuccess -> do\n          lift $ log \"Success\"\n          lift $ log msg\n          return $ Just \"Ran passthru.updateScript\"\n        ExitFailure num -> do\n          throwE $ \"[updateScript] Failed with exit code \" <> tshow num <> \"\\n\" <> msg\n    else do\n      lift $ log \"skipping because derivation has no updateScript\"\n      return Nothing\n\n--------------------------------------------------------------------------------\n-- Common helper functions and utilities\n-- Helper to update version and src attributes, re-computing the sha256.\n-- This is done by the generic version upgrader, but is also a sub-component of some of the others.\nsrcVersionFix :: MonadIO m => Args -> ExceptT Text m ()\nsrcVersionFix Args {..} = do\n  let UpdateEnv {..} = updateEnv\n  oldHash <- Nix.getHash attrPath\n  _ <- lift $ File.replaceIO oldVersion newVersion derivationFile\n  let fakeHash = 
Nix.fakeHashMatching oldHash\n  _ <- lift $ File.replaceIO oldHash fakeHash derivationFile\n  newHash <- Nix.getHashFromBuild attrPath\n  when (oldHash == newHash) $ throwE \"Hashes equal; no update necessary\"\n  _ <- lift $ File.replaceIO fakeHash newHash derivationFile\n  return ()\n"
  },
  {
    "path": "src/Skiplist.hs",
    "content": "{-# LANGUAGE FlexibleContexts #-}\n{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE RankNTypes #-}\n\nmodule Skiplist\n  ( packageName,\n    content,\n    attrPath,\n    checkResult,\n    python,\n    skipOutpathCalc,\n  )\nwhere\n\nimport Data.Foldable (find)\nimport qualified Data.Text as T\nimport OurPrelude\n\ntype Skiplist = [(Text -> Bool, Text)]\n\ntype TextSkiplister m =\n  (MonadError Text m) =>\n  Text ->\n  m ()\n\nattrPath :: TextSkiplister m\nattrPath = skiplister attrPathList\n\npackageName :: TextSkiplister m\npackageName name =\n  if name == \"elementary-xfce-icon-theme\" -- https://github.com/nix-community/nixpkgs-update/issues/63\n    then return ()\n    else skiplister nameList name\n\ncontent :: TextSkiplister m\ncontent = skiplister contentList\n\ncheckResult :: TextSkiplister m\ncheckResult = skiplister checkResultList\n\nskipOutpathCalc :: TextSkiplister m\nskipOutpathCalc = skiplister skipOutpathCalcList\n\nattrPathList :: Skiplist\nattrPathList =\n  [ prefix \"lxqt\" \"Packages for lxqt are currently skipped.\",\n    prefix\n      \"altcoins.bitcoin\"\n      \"@roconnor asked for a skip on this until something can be done with GPG signatures https://github.com/NixOS/nixpkgs/commit/77f3ac7b7638b33ab198330eaabbd6e0a2e751a9\",\n    eq \"sqlite-interactive\" \"it is an override\",\n    eq \"harfbuzzFull\" \"it is an override\",\n    prefix\n      \"luanti-\"\n      \"luanti-server and luanti-client are different outputs for the luanti package\",\n    prefix\n      \"mate.\"\n      \"mate packages are upgraded in lockstep https://github.com/NixOS/nixpkgs/pull/50695#issuecomment-441338593\",\n    prefix\n      \"deepin\"\n      \"deepin packages are upgraded in lockstep https://github.com/NixOS/nixpkgs/pull/52327#issuecomment-447684194\",\n    prefix\n      \"rocmPackages\"\n      \"rocm packages are upgraded in lockstep https://github.com/NixOS/nixpkgs/issues/385294\",\n    prefix\n      \"monero-\"\n      \"monero-cli and 
monero-gui packages are upgraded in lockstep\",\n    prefix\n      \"element-desktop\"\n      \"@Ma27 asked to skip\",\n    prefix\n      \"element-web\"\n      \"has to be updated along with element-desktop\",\n    prefix\n      \"keybinder\"\n      \"it has weird tags. see nixpkgs-update#232\",\n    infixOf\n      \"pysc2\"\n      \"crashes nixpkgs-update\",\n    infixOf\n      \"tornado\"\n      \"python updatescript updates pinned versions\",\n    prefix\n      \"spire-\"\n      \"spire-server and spire-agent are different outputs for spire package\",\n    eq \"imagemagick_light\" \"same file and version as imagemagick\",\n    eq \"imagemagickBig\" \"same file and version as imagemagick\",\n    eq \"libheimdal\" \"alias of heimdal\",\n    eq \"minio_legacy_fs\" \"@bachp asked to skip\",\n    eq \"flint\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"keepmenu\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"klee\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"dumbpipe\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"python3Packages.aiosonic\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"python3Packages.guidata\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"router\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"wifite2\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"granian\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"python3Packages.granian\" \"update repeatedly exceeded the 6h timeout\",\n    eq \"vlagent\" \"updates via victorialogs package\",\n    eq \"vmagent\" \"updates via victoriametrics package\",\n    eq \"qemu_full\" \"updates via qemu package\",\n    eq \"qemu_kvm\" \"updates via qemu package\",\n    eq \"qemu-user\" \"updates via qemu package\",\n    eq \"qemu-utils\" \"updates via qemu package\",\n    eq \"ollama-rocm\" \"only `ollama` is explicitly updated (defined in the same file)\",\n    eq \"ollama-cuda\" \"only `ollama` is explicitly updated 
(defined in the same file)\",\n    eq \"python3Packages.mmengine\" \"takes way too long to build\",\n    eq \"bitwarden-directory-connector-cli\" \"src is aliased to bitwarden-directory-connector\",\n    eq \"vaultwarden-mysql\" \"src is aliased to vaultwarden\",\n    eq \"vaultwarden-postgresql\" \"src is aliased to vaultwarden\",\n    eq \"dune\" \"same as dune_3\",\n    eq \"kanata-with-cmd\" \"same src as kanata\",\n    eq \"curlFull\" \"same as curl\",\n    eq \"curlMinimal\" \"same as curl\",\n    eq\n      \"azure-sdk-for-cpp.curl\"\n      \"same as curl\",\n    eq\n      \"yt-dlp-light\"\n      \"updates via yt-dlp\"\n  ]\n\nnameList :: Skiplist\nnameList =\n  [ prefix \"r-\" \"we don't know how to find the attrpath for these\",\n    infixOf \"jquery\" \"this isn't a real package\",\n    infixOf \"google-cloud-sdk\" \"complicated package\",\n    infixOf \"github-release\" \"complicated package\",\n    infixOf \"perl\" \"currently don't know how to update perl\",\n    infixOf \"cdrtools\" \"We keep downgrading this by accident.\",\n    infixOf \"gst\" \"gstreamer plugins are kept in lockstep.\",\n    infixOf \"electron\" \"multi-platform srcs in file.\",\n    infixOf \"xfce\" \"@volth asked to not update xfce\",\n    infixOf \"cmake-cursesUI-qt4UI\" \"Derivation file is complicated\",\n    infixOf \"iana-etc\" \"@mic92 takes care of this package\",\n    infixOf\n      \"checkbashism\"\n      \"needs to be fixed, see https://github.com/NixOS/nixpkgs/pull/39552\",\n    eq \"isl\" \"multi-version long building package\",\n    infixOf \"qscintilla\" \"https://github.com/nix-community/nixpkgs-update/issues/51\",\n    eq \"itstool\" \"https://github.com/NixOS/nixpkgs/pull/41339\",\n    infixOf\n      \"virtualbox\"\n      \"nixpkgs-update cannot handle updating the guest additions https://github.com/NixOS/nixpkgs/pull/42934\",\n    eq\n      \"avr-binutils\"\n      \"https://github.com/NixOS/nixpkgs/pull/43787#issuecomment-408649537\",\n    eq\n      \"iasl\"\n    
  \"two updates had to be reverted, https://github.com/NixOS/nixpkgs/pull/46272\",\n    eq\n      \"meson\"\n      \"https://github.com/NixOS/nixpkgs/pull/47024#issuecomment-423300633\",\n    eq\n      \"burp\"\n      \"skipped until better versioning schema https://github.com/NixOS/nixpkgs/pull/46298#issuecomment-419536301\",\n    eq \"chromedriver\" \"complicated package\",\n    eq\n      \"gitlab-shell\"\n      \"@globin asked to skip in https://github.com/NixOS/nixpkgs/pull/52294#issuecomment-447653417\",\n    eq\n      \"gitlab-workhorse\"\n      \"@globin asked to skip in https://github.com/NixOS/nixpkgs/pull/52286#issuecomment-447653409\",\n    eq\n      \"gitlab-elasticsearch-indexer\"\n      \"@yayayayaka asked to skip in https://github.com/NixOS/nixpkgs/pull/244074#issuecomment-1641657015\",\n    eq \"reposurgeon\" \"takes way too long to build\",\n    eq \"kodelife\" \"multiple system hashes need to be updated at once\",\n    eq \"openbazaar\" \"multiple system hashes need to be updated at once\",\n    eq \"stalwart-mail-enterprise\" \"stalwart-mail-enterprise follows stalwart-mail, which should be updated instead\",\n    eq \"eaglemode\" \"build hangs or takes way too long\",\n    eq \"autoconf\" \"@prusnak asked to skip\",\n    eq \"abseil-cpp\" \"@andersk asked to skip\",\n    eq \"_7zz-rar\" \"will be updated by _7zz proper\",\n    eq \"ncbi-vdb\" \"updating this alone breaks sratoolkit\",\n    eq \"sratoolkit\" \"tied to version of ncbi-vdb\",\n    eq \"libsignal-ffi\" \"must match the version required by mautrix-signal\",\n    eq\n      \"floorp\"\n      \"big package, does not update hashes correctly (https://github.com/NixOS/nixpkgs/pull/424715#issuecomment-3163626684)\",\n    eq\n      \"discord-ptb\"\n      \"updates through discord only https://github.com/NixOS/nixpkgs/issues/468956\",\n    eq\n      \"discord-canary\"\n      \"updates through discord only https://github.com/NixOS/nixpkgs/issues/468956\",\n    eq\n      
\"discord-development\"\n      \"updates through discord only https://github.com/NixOS/nixpkgs/issues/468956\"\n  ]\n\ncontentList :: Skiplist\ncontentList =\n  [ infixOf \"nixpkgs-update: no auto update\" \"Derivation file opts-out of auto-updates\",\n    infixOf \"DO NOT EDIT\" \"Derivation file says not to edit it\",\n    infixOf \"Do not edit!\" \"Derivation file says not to edit it\",\n    -- Skip packages that have special builders\n    infixOf \"buildRustCrate\" \"Derivation contains buildRustCrate\",\n    infixOf \"buildRubyGem\" \"Derivation contains buildRubyGem\",\n    infixOf \"bundlerEnv\" \"Derivation contains bundlerEnv\",\n    infixOf \"buildPerlPackage\" \"Derivation contains buildPerlPackage\",\n    -- Specific skips for classes of packages\n    infixOf \"teams.gnome\" \"Do not update GNOME during a release cycle\",\n    infixOf \"https://downloads.haskell.org/ghc/\" \"GHC packages are versioned per file\"\n  ]\n\ncheckResultList :: Skiplist\ncheckResultList =\n  [ infixOf\n      \"busybox\"\n      \"- busybox result is not automatically checked, because some binaries kill the shell\",\n    infixOf\n      \"gjs\"\n      \"- gjs result is not automatically checked, because some tests take a long time to run\",\n    infixOf\n      \"casperjs\"\n      \"- casperjs result is not automatically checked, because some tests take a long time to run\",\n    binariesStickAround \"kicad\",\n    binariesStickAround \"fcitx\",\n    binariesStickAround \"x2goclient\",\n    binariesStickAround \"gpg-agent\",\n    binariesStickAround \"dirmngr\",\n    binariesStickAround \"barrier\",\n    binariesStickAround \"fail2ban\",\n    binariesStickAround \"zed\",\n    binariesStickAround \"haveged\"\n  ]\n\nskipOutpathCalcList :: Skiplist\nskipOutpathCalcList =\n  [ eq \"firefox-beta-bin-unwrapped\" \"master\",\n    eq \"firefox-devedition-bin-unwrapped\" \"master\",\n    -- \"firefox-release-bin-unwrapped\" is unneeded here because firefox-bin is a dependency of other 
packages that Hydra doesn't ignore.\n    prefix \"linuxKernel.kernels\" \"master\",\n    eq \"bmake\" \"staging\" -- mass rebuild only on darwin\n  ]\n\nbinariesStickAround :: Text -> (Text -> Bool, Text)\nbinariesStickAround name =\n  infixOf name (\"- \" <> name <> \" result is not automatically checked because some binaries stick around\")\n\nskiplister :: Skiplist -> TextSkiplister m\nskiplister skiplist input = forM_ result throwError\n  where\n    result = snd <$> find (\\(isSkiplisted, _) -> isSkiplisted input) skiplist\n\nprefix :: Text -> Text -> (Text -> Bool, Text)\nprefix part reason = ((part `T.isPrefixOf`), reason)\n\ninfixOf :: Text -> Text -> (Text -> Bool, Text)\ninfixOf part reason = ((part `T.isInfixOf`), reason)\n\neq :: Text -> Text -> (Text -> Bool, Text)\neq part reason = ((part ==), reason)\n\npython :: Monad m => Int -> Text -> ExceptT Text m ()\npython numPackageRebuilds derivationContents =\n  tryAssert\n    ( \"Python package with too many package rebuilds \"\n        <> tshow numPackageRebuilds\n        <> \" > \"\n        <> tshow maxPackageRebuild\n    )\n    (not isPython || numPackageRebuilds <= maxPackageRebuild)\n  where\n    isPython = \"buildPythonPackage\" `T.isInfixOf` derivationContents\n    maxPackageRebuild = 100\n"
  },
  {
    "path": "src/Update.hs",
    "content": "{-# LANGUAGE ExtendedDefaultRules #-}\n{-# LANGUAGE FlexibleContexts #-}\n{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE QuasiQuotes #-}\n{-# LANGUAGE RecordWildCards #-}\n{-# LANGUAGE ScopedTypeVariables #-}\n{-# LANGUAGE TemplateHaskell #-}\n{-# OPTIONS_GHC -fno-warn-type-defaults #-}\n\nmodule Update\n  ( addPatched,\n    assertNotUpdatedOn,\n    cveAll,\n    cveReport,\n    prMessage,\n    sourceGithubAll,\n    updatePackage,\n  )\nwhere\n\nimport CVE (CVE, cveID, cveLI)\nimport qualified Check\nimport Control.Exception (bracket)\nimport Control.Monad.Writer (execWriterT, tell)\nimport Data.Maybe (fromJust)\nimport Data.Monoid (Alt (..))\nimport qualified Data.Set as S\nimport qualified Data.Text as T\nimport qualified Data.Text.IO as T\nimport Data.Time.Calendar (showGregorian)\nimport Data.Time.Clock (getCurrentTime, utctDay)\nimport qualified Data.Vector as V\nimport qualified GH\nimport qualified Git\nimport NVD (getCVEs, withVulnDB)\nimport qualified Nix\nimport qualified NixpkgsReview\nimport OurPrelude\nimport qualified Outpaths\nimport qualified Rewrite\nimport qualified Skiplist\nimport System.Directory (doesDirectoryExist, withCurrentDirectory)\nimport System.Posix.Directory (createDirectory)\nimport Utils\n  ( Boundary (..),\n    Options (..),\n    UpdateEnv (..),\n    VersionMatcher (..),\n    branchName,\n    logDir,\n    parseUpdates,\n    prTitle,\n    whenBatch,\n  )\nimport qualified Utils as U\nimport qualified Version\nimport Prelude hiding (log)\n\ndefault (T.Text)\n\nalsoLogToAttrPath :: Text -> (Text -> IO ()) -> Text -> IO (Text -> IO ())\nalsoLogToAttrPath attrPath topLevelLog url = do\n  logFile <- attrPathLogFilePath attrPath\n  T.appendFile logFile $ \"Running nixpkgs-update (\" <> url <> \") with UPDATE_INFO: \"\n  let attrPathLog = log' logFile\n  return \\text -> do\n    topLevelLog text\n    attrPathLog text\n\nlog' :: MonadIO m => FilePath -> Text -> m ()\nlog' logFile msg = liftIO $ T.appendFile logFile (msg <> 
\"\\n\")\n\nattrPathLogFilePath :: Text -> IO String\nattrPathLogFilePath attrPath = do\n  lDir <- logDir\n  now <- getCurrentTime\n  let dir = lDir <> \"/\" <> T.unpack attrPath\n  dirExists <- doesDirectoryExist dir\n  unless\n    dirExists\n    (createDirectory dir U.regDirMode)\n  let logFile = dir <> \"/\" <> showGregorian (utctDay now) <> \".log\"\n  putStrLn (\"For attrpath \" <> T.unpack attrPath <> \", using log file: \" <> logFile)\n  return logFile\n\nnotifyOptions :: (Text -> IO ()) -> Options -> IO ()\nnotifyOptions log o = do\n  let repr f = if f o then \"YES\" else \"NO\"\n  let ghUser = GH.untagName . githubUser $ o\n  let pr = repr doPR\n  let batch = repr batchUpdate\n  let outpaths = repr calculateOutpaths\n  let cve = repr makeCVEReport\n  let review = repr runNixpkgsReview\n  let exactAttrPath = repr U.attrpath\n  npDir <- tshow <$> Git.nixpkgsDir\n  log $\n    [interpolate| [options] github_user: $ghUser, pull_request: $pr, batch_update: $batch, calculate_outpaths: $outpaths, cve_report: $cve, nixpkgs-review: $review, nixpkgs_dir: $npDir, use attrpath: $exactAttrPath|]\n\ncveAll :: Options -> Text -> IO ()\ncveAll o updates = do\n  let u' = rights $ parseUpdates updates\n  results <-\n    mapM\n      ( \\(p, oldV, newV, url) -> do\n          r <- cveReport (UpdateEnv p oldV newV url o)\n          return $ p <> \": \" <> oldV <> \" -> \" <> newV <> \"\\n\" <> r\n      )\n      u'\n  T.putStrLn (T.unlines results)\n\nsourceGithubAll :: Options -> Text -> IO ()\nsourceGithubAll o updates = do\n  let u' = rights $ parseUpdates updates\n  _ <-\n    runExceptT $ do\n      Git.fetchIfStale <|> liftIO (T.putStrLn \"Failed to fetch.\")\n      Git.cleanAndResetTo \"master\"\n  mapM_\n    ( \\(p, oldV, newV, url) -> do\n        let updateEnv = UpdateEnv p oldV newV url o\n        runExceptT $ do\n          attrPath <- Nix.lookupAttrPath updateEnv\n          srcUrl <- Nix.getSrcUrl attrPath\n          v <- GH.latestVersion updateEnv srcUrl\n          if v 
/= newV\n            then\n              liftIO $\n                T.putStrLn $\n                  p <> \": \" <> oldV <> \" -> \" <> newV <> \" -> \" <> v\n            else return ()\n    )\n    u'\n\ndata UpdatePackageResult = UpdatePackageSuccess | UpdatePackageFailure\n\n-- Arguments this function should have to make it testable:\n-- - the merge base commit (should be updated externally to this function)\n-- - the commit for branches: master, staging, staging-next, staging-nixos\nupdatePackageBatch ::\n  (Text -> IO ()) ->\n  Text ->\n  UpdateEnv ->\n  IO UpdatePackageResult\nupdatePackageBatch simpleLog updateInfoLine updateEnv@UpdateEnv {..} = do\n  eitherFailureOrAttrpath <- runExceptT $ do\n    -- Filters that don't need git\n    whenBatch updateEnv do\n      Skiplist.packageName packageName\n      -- Update our git checkout\n      Git.fetchIfStale <|> liftIO (T.putStrLn \"Failed to fetch.\")\n\n    -- Filters: various cases where we shouldn't update the package\n    if attrpath options\n      then return packageName\n      else Nix.lookupAttrPath updateEnv\n  let url =\n        if isBot updateEnv\n          then \"https://nix-community.org/update-bot/\"\n          else \"https://github.com/nix-community/nixpkgs-update\"\n  case eitherFailureOrAttrpath of\n    Left failure -> do\n      simpleLog failure\n      return UpdatePackageFailure\n    Right foundAttrPath -> do\n      log <- alsoLogToAttrPath foundAttrPath simpleLog url\n      log updateInfoLine\n      mergeBase <-\n        if batchUpdate options\n          then Git.mergeBase\n          else pure \"HEAD\"\n      withWorktree mergeBase foundAttrPath updateEnv $\n        updateAttrPath log mergeBase updateEnv foundAttrPath\n\ncheckExistingUpdate ::\n  (Text -> IO ()) ->\n  UpdateEnv ->\n  Maybe Text ->\n  Text ->\n  ExceptT Text IO ()\ncheckExistingUpdate log updateEnv existingCommitMsg attrPath = do\n  case existingCommitMsg of\n    Nothing -> lift $ log \"No auto update branch exists\"\n    Just msg 
-> do\n      let nV = newVersion updateEnv\n      lift $\n        log\n          [interpolate|An auto update branch exists with message `$msg`. New version is $nV.|]\n\n      case U.titleVersion msg of\n        Just branchV\n          | Version.matchVersion (RangeMatcher (Including nV) Unbounded) branchV ->\n              throwError \"An auto update branch exists with an equal or greater version\"\n        _ ->\n          lift $ log \"The auto update branch does not match or exceed the new version.\"\n\n  -- Note that this check looks for PRs with the same old and new\n  -- version numbers, so it won't stop us from updating an existing PR\n  -- if this run updates the package to a newer version.\n  GH.checkExistingUpdatePR updateEnv attrPath\n\nupdateAttrPath ::\n  (Text -> IO ()) ->\n  Text ->\n  UpdateEnv ->\n  Text ->\n  IO UpdatePackageResult\nupdateAttrPath log mergeBase updateEnv@UpdateEnv {..} attrPath = do\n  log $ \"attrpath: \" <> attrPath\n  let pr = doPR options\n\n  successOrFailure <- runExceptT $ do\n    hasUpdateScript <- Nix.hasUpdateScript attrPath\n\n    existingCommitMsg <- fmap getAlt . 
execWriterT $\n      whenBatch updateEnv do\n        Skiplist.attrPath attrPath\n        when pr do\n          liftIO $ log \"Checking auto update branch...\"\n          mbLastCommitMsg <- lift $ Git.findAutoUpdateBranchMessage packageName\n          tell $ Alt mbLastCommitMsg\n          unless hasUpdateScript do\n            lift $ checkExistingUpdate log updateEnv mbLastCommitMsg attrPath\n\n    unless hasUpdateScript do\n      Nix.assertNewerVersion updateEnv\n      Version.assertCompatibleWithPathPin updateEnv attrPath\n\n    let skipOutpathBase = either Just (const Nothing) $ Skiplist.skipOutpathCalc packageName\n\n    derivationFile <- Nix.getDerivationFile attrPath\n    unless hasUpdateScript do\n      assertNotUpdatedOn updateEnv derivationFile \"master\"\n      assertNotUpdatedOn updateEnv derivationFile \"staging\"\n      assertNotUpdatedOn updateEnv derivationFile \"staging-next\"\n      assertNotUpdatedOn updateEnv derivationFile \"staging-nixos\"\n\n    -- Calculate output paths for rebuilds and our merge base\n    let calcOutpaths = calculateOutpaths options && isNothing skipOutpathBase\n    mergeBaseOutpathSet <-\n      if calcOutpaths\n        then Outpaths.currentOutpathSet\n        else return $ Outpaths.dummyOutpathSetBefore attrPath\n\n    -- Get the original values for diffing purposes\n    derivationContents <- liftIO $ T.readFile $ T.unpack derivationFile\n    oldHash <- Nix.getHash attrPath <|> pure \"\"\n    oldSrcUrl <- Nix.getSrcUrl attrPath <|> pure \"\"\n    oldRev <- Nix.getAttrString \"rev\" attrPath <|> pure \"\"\n    oldVerMay <- rightMay `fmapRT` (lift $ runExceptT $ Nix.getAttrString \"version\" attrPath)\n\n    tryAssert\n      \"The derivation has no 'version' attribute, so we do not know how to figure out the version while doing an updateScript update\"\n      (not hasUpdateScript || isJust oldVerMay)\n\n    -- One final filter\n    Skiplist.content derivationContents\n\n
    ----------------------------------------------------------------------------\n    -- UPDATES\n    --\n    -- At this point, we've stashed the old derivation contents and\n    -- validated that we actually should be rewriting something. Get\n    -- to work processing the various rewrite functions!\n    rewriteMsgs <- Rewrite.runAll log Rewrite.Args {derivationFile = T.unpack derivationFile, ..}\n    ----------------------------------------------------------------------------\n\n    -- Compute the diff and get updated values\n    diffAfterRewrites <- Git.diff mergeBase\n    tryAssert\n      \"The diff was empty after rewrites.\"\n      (diffAfterRewrites /= T.empty)\n    lift . log $ \"Diff after rewrites:\\n\" <> diffAfterRewrites\n    updatedDerivationContents <- liftIO $ T.readFile $ T.unpack derivationFile\n    newSrcUrl <- Nix.getSrcUrl attrPath <|> pure \"\"\n    newHash <- Nix.getHash attrPath <|> pure \"\"\n    newRev <- Nix.getAttrString \"rev\" attrPath <|> pure \"\"\n    newVerMay <- rightMay `fmapRT` (lift $ runExceptT $ Nix.getAttrString \"version\" attrPath)\n\n    tryAssert\n      \"The derivation has no 'version' attribute, so we do not know how to figure out the version while doing an updateScript update\"\n      (not hasUpdateScript || isJust newVerMay)\n\n    -- Sanity checks to make sure the PR is worth opening\n    unless hasUpdateScript do\n      when (derivationContents == updatedDerivationContents) $ throwE \"No rewrites performed on derivation.\"\n      when (oldSrcUrl /= \"\" && oldSrcUrl == newSrcUrl) $ throwE \"Source url did not change.
\"\n      when (oldHash /= \"\" && oldHash == newHash) $ throwE \"Hashes equal; no update necessary\"\n      when (oldRev /= \"\" && oldRev == newRev) $ throwE \"rev equal; no update necessary\"\n\n    --\n    -- Update updateEnv if using updateScript\n    updateEnv' <-\n      if hasUpdateScript\n        then do\n          -- Already checked that these are Just above.\n          let oldVer = fromJust oldVerMay\n          let newVer = fromJust newVerMay\n\n          -- Some update scripts make file changes but don't update the package\n          -- version; ignore these updates (#388)\n          when (newVer == oldVer) $ throwE \"Package version did not change.\"\n\n          return $\n            UpdateEnv\n              packageName\n              oldVer\n              newVer\n              (Just \"passthru.updateScript\")\n              options\n        else return updateEnv\n\n    whenBatch updateEnv do\n      when pr do\n        when hasUpdateScript do\n          checkExistingUpdate log updateEnv' existingCommitMsg attrPath\n\n    when hasUpdateScript do\n      changedFiles <- Git.diffFileNames mergeBase\n      let rewrittenFile = case changedFiles of [f] -> f; _ -> derivationFile\n      assertNotUpdatedOn updateEnv' rewrittenFile \"master\"\n      assertNotUpdatedOn updateEnv' rewrittenFile \"staging\"\n      assertNotUpdatedOn updateEnv' rewrittenFile \"staging-next\"\n      assertNotUpdatedOn updateEnv' rewrittenFile \"staging-nixos\"\n\n    --\n    -- Outpaths\n    -- this section is very slow\n    editedOutpathSet <- if calcOutpaths then Outpaths.currentOutpathSetUncached else return $ Outpaths.dummyOutpathSetAfter attrPath\n    let opDiff = S.difference mergeBaseOutpathSet editedOutpathSet\n    let numPRebuilds = Outpaths.numPackageRebuilds opDiff\n    whenBatch updateEnv do\n      Skiplist.python numPRebuilds derivationContents\n    when (numPRebuilds == 0) (throwE \"Update edits cause no rebuilds.\")\n    -- end outpaths section\n\n    Nix.build 
attrPath\n\n    --\n    -- Publish the result\n    lift . log $ \"Successfully finished processing\"\n    result <- Nix.resultLink\n    let opReport =\n          if isJust skipOutpathBase\n            then \"Outpath calculations were skipped for this package; total number of rebuilds unknown.\"\n            else Outpaths.outpathReport opDiff\n    let prBase =\n          flip\n            fromMaybe\n            skipOutpathBase\n            if Outpaths.numPackageRebuilds opDiff <= 500\n              then\n                if any (T.isInfixOf \"nixosTests.simple\") (V.toList $ Outpaths.packageRebuilds opDiff)\n                  then \"staging-nixos\"\n                  else \"master\"\n              else \"staging\"\n    publishPackage log updateEnv' oldSrcUrl newSrcUrl attrPath result opReport prBase rewriteMsgs (isJust existingCommitMsg)\n\n  case successOrFailure of\n    Left failure -> do\n      log failure\n      return UpdatePackageFailure\n    Right () -> return UpdatePackageSuccess\n\npublishPackage ::\n  (Text -> IO ()) ->\n  UpdateEnv ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  [Text] ->\n  Bool ->\n  ExceptT Text IO ()\npublishPackage log updateEnv oldSrcUrl newSrcUrl attrPath result opReport prBase rewriteMsgs branchExists = do\n  cacheTestInstructions <- doCache log updateEnv result\n  resultCheckReport <-\n    case Skiplist.checkResult (packageName updateEnv) of\n      Right () -> lift $ Check.result updateEnv (T.unpack result)\n      Left msg -> pure msg\n  metaDescription <- Nix.getDescription attrPath <|> return T.empty\n  metaHomepage <- Nix.getHomepage attrPath <|> return T.empty\n  metaChangelog <- Nix.getChangelog attrPath <|> return T.empty\n  cveRep <- liftIO $ cveReport updateEnv\n  releaseUrl <- GH.releaseUrl updateEnv newSrcUrl <|> return \"\"\n  compareUrl <- GH.compareUrl oldSrcUrl newSrcUrl <|> return \"\"\n  maintainers <- Nix.getMaintainers attrPath\n  let commitMsg = commitMessage updateEnv attrPath\n  
Git.commit commitMsg\n  commitRev <- Git.headRev\n  nixpkgsReviewMsg <-\n    if prBase /= \"staging\" && (runNixpkgsReview . options $ updateEnv)\n      then liftIO $ NixpkgsReview.runReport log commitRev\n      else return \"\"\n  -- Try to push it three times\n  -- (these pushes use --force, so it doesn't matter if branchExists is True)\n  when\n    (doPR . options $ updateEnv)\n    (Git.push updateEnv <|> Git.push updateEnv <|> Git.push updateEnv)\n  let prMsg =\n        prMessage\n          updateEnv\n          metaDescription\n          metaHomepage\n          metaChangelog\n          rewriteMsgs\n          releaseUrl\n          compareUrl\n          resultCheckReport\n          commitRev\n          attrPath\n          maintainers\n          result\n          opReport\n          cveRep\n          cacheTestInstructions\n          nixpkgsReviewMsg\n  liftIO $ log prMsg\n  if (doPR . options $ updateEnv)\n    then do\n      let ghUser = GH.untagName . githubUser . options $ updateEnv\n      let mkPR = if branchExists then GH.prUpdate else GH.pr\n      (reusedPR, pullRequestUrl) <- mkPR updateEnv (prTitle updateEnv attrPath) prMsg (ghUser <> \":\" <> (branchName updateEnv)) prBase\n      when branchExists $\n        liftIO $\n          log\n            if reusedPR\n              then \"Updated existing PR\"\n              else \"Reused existing auto update branch, but no corresponding open PR was found, so created a new PR\"\n      liftIO $ log pullRequestUrl\n    else liftIO $ T.putStrLn prMsg\n\ncommitMessage :: UpdateEnv -> Text -> Text\ncommitMessage updateEnv attrPath = prTitle updateEnv attrPath\n\nprMessage ::\n  UpdateEnv ->\n  Text ->\n  Text ->\n  Text ->\n  [Text] ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text ->\n  Text\nprMessage updateEnv metaDescription metaHomepage metaChangelog rewriteMsgs releaseUrl compareUrl resultCheckReport commitRev attrPath maintainers resultPath 
opReport cveRep cacheTestInstructions nixpkgsReviewMsg =\n  -- Some components of the PR description are pre-generated prior to calling\n  -- because they require IO, but in general try to put as much as possible for\n  -- the formatting into the pure function so that we can control the body\n  -- formatting in one place and unit test it.\n  let metaHomepageLine =\n        if metaHomepage == T.empty\n          then \"\"\n          else \"meta.homepage for \" <> attrPath <> \" is: \" <> metaHomepage\n      metaDescriptionLine =\n        if metaDescription == T.empty\n          then \"\"\n          else \"meta.description for \" <> attrPath <> \" is: \" <> metaDescription\n      metaChangelogLine =\n        if metaChangelog == T.empty\n          then \"\"\n          else \"meta.changelog for \" <> attrPath <> \" is: \" <> metaChangelog\n      rewriteMsgsLine = foldl (\\ms m -> ms <> T.pack \"\\n- \" <> m) \"\\n###### Updates performed\" rewriteMsgs\n      maintainersCc =\n        if not (T.null maintainers)\n          then \"cc \" <> maintainers <> \" for [testing](https://github.com/nix-community/nixpkgs-update/blob/main/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do).\"\n          else \"\"\n      releaseUrlMessage =\n        if releaseUrl == T.empty\n          then \"\"\n          else \"- [Release on GitHub](\" <> releaseUrl <> \")\"\n      compareUrlMessage =\n        if compareUrl == T.empty\n          then \"\"\n          else \"- [Compare changes on GitHub](\" <> compareUrl <> \")\"\n      nixpkgsReviewSection =\n        if nixpkgsReviewMsg == T.empty\n          then \"Nixpkgs review skipped\"\n          else\n            [interpolate|\n            We have automatically built all packages that will get rebuilt due to\n            this change.\n\n            This gives evidence on whether the upgrade will break dependent packages.\n            Note sometimes packages show up as _failed to build_ independent of the\n            
change, simply because they are already broken on the target branch.\n\n            $nixpkgsReviewMsg\n            |]\n      pat link = [interpolate|This update was made based on information from $link.|]\n      sourceLinkInfo = maybe \"\" pat $ sourceURL updateEnv\n      ghUser = GH.untagName . githubUser . options $ updateEnv\n      batch = batchUpdate . options $ updateEnv\n      automatic = if batch then \"Automatic\" else \"Semi-automatic\"\n   in [interpolate|\n       $automatic update generated by [nixpkgs-update](https://github.com/nix-community/nixpkgs-update) tools. $sourceLinkInfo\n\n       $metaDescriptionLine\n\n       $metaHomepageLine\n\n       $metaChangelogLine\n\n       $rewriteMsgsLine\n\n       ###### To inspect upstream changes\n\n       $releaseUrlMessage\n\n       $compareUrlMessage\n\n       ###### Impact\n\n       <b>Checks done</b>\n\n       ---\n\n       - built on NixOS\n       $resultCheckReport\n\n       ---\n\n       <details>\n       <summary>\n       <b>Rebuild report</b> (if merged into master) (click to expand)\n       </summary>\n\n       ```\n       $opReport\n       ```\n\n       </details>\n\n       <details>\n       <summary>\n       <b>Instructions to test this update</b> (click to expand)\n       </summary>\n\n       ---\n\n       $cacheTestInstructions\n       ```\n       nix-build -A $attrPath https://github.com/$ghUser/nixpkgs/archive/$commitRev.tar.gz\n       ```\n       Or:\n       ```\n       nix build github:$ghUser/nixpkgs/$commitRev#$attrPath\n       ```\n\n       After you've downloaded or built it, look at the files and if there are any, run the binaries:\n       ```\n       ls -la $resultPath\n       ls -la $resultPath/bin\n       ```\n\n       ---\n\n       </details>\n       <br/>\n\n       $cveRep\n\n       ### Pre-merge build results\n\n       $nixpkgsReviewSection\n\n       ---\n\n       ###### Maintainer pings\n\n       $maintainersCc\n\n       > [!TIP]\n       > As a maintainer, if your package is located 
under `pkgs/by-name/*`, you can comment **`@NixOS/nixpkgs-merge-bot merge`** to automatically merge this update using the [`nixpkgs-merge-bot`](https://github.com/NixOS/nixpkgs/blob/master/ci/README.md#nixpkgs-merge-bot).\n    |]\n\nassertNotUpdatedOn ::\n  MonadIO m => UpdateEnv -> Text -> Text -> ExceptT Text m ()\nassertNotUpdatedOn updateEnv derivationFile branch = do\n  derivationContents <- Git.show branch derivationFile\n  Nix.assertOldVersionOn updateEnv branch derivationContents\n\naddPatched :: Text -> Set CVE -> IO [(CVE, Bool)]\naddPatched attrPath set = do\n  let list = S.toList set\n  forM\n    list\n    ( \\cve -> do\n        patched <- runExceptT $ Nix.hasPatchNamed attrPath (cveID cve)\n        let p =\n              case patched of\n                Left _ -> False\n                Right r -> r\n        return (cve, p)\n    )\n\ncveReport :: UpdateEnv -> IO Text\ncveReport updateEnv =\n  if not (makeCVEReport . options $ updateEnv)\n    then return \"\"\n    else withVulnDB $ \\conn -> do\n      let pname1 = packageName updateEnv\n      let pname2 = T.replace \"-\" \"_\" pname1\n      oldCVEs1 <- getCVEs conn pname1 (oldVersion updateEnv)\n      oldCVEs2 <- getCVEs conn pname2 (oldVersion updateEnv)\n      let oldCVEs = S.fromList (oldCVEs1 ++ oldCVEs2)\n      newCVEs1 <- getCVEs conn pname1 (newVersion updateEnv)\n      newCVEs2 <- getCVEs conn pname2 (newVersion updateEnv)\n      let newCVEs = S.fromList (newCVEs1 ++ newCVEs2)\n      let inOldButNotNew = S.difference oldCVEs newCVEs\n          inNewButNotOld = S.difference newCVEs oldCVEs\n          inBoth = S.intersection oldCVEs newCVEs\n          ifEmptyNone t =\n            if t == T.empty\n              then \"none\"\n              else t\n      inOldButNotNew' <- addPatched (packageName updateEnv) inOldButNotNew\n      inNewButNotOld' <- addPatched (packageName updateEnv) inNewButNotOld\n      inBoth' <- addPatched (packageName updateEnv) inBoth\n      let toMkdownList = fmap (uncurry 
cveLI) >>> T.unlines >>> ifEmptyNone\n          fixedList = toMkdownList inOldButNotNew'\n          newList = toMkdownList inNewButNotOld'\n          unresolvedList = toMkdownList inBoth'\n      if fixedList == \"none\" && unresolvedList == \"none\" && newList == \"none\"\n        then return \"\"\n        else\n          return\n            [interpolate|\n      ###### Security vulnerability report\n\n      <details>\n      <summary>\n      Security report (click to expand)\n      </summary>\n\n      CVEs resolved by this update:\n      $fixedList\n\n      CVEs introduced by this update:\n      $newList\n\n      CVEs present in both versions:\n      $unresolvedList\n\n\n       </details>\n       <br/>\n      |]\n\nisBot :: UpdateEnv -> Bool\nisBot updateEnv =\n  let o = options updateEnv\n   in batchUpdate o && \"r-ryantm\" == (GH.untagName $ githubUser o)\n\ndoCache :: MonadIO m => (Text -> m ()) -> UpdateEnv -> Text -> ExceptT Text m Text\ndoCache log updateEnv resultPath =\n  if isBot updateEnv\n    then do\n      return\n        [interpolate|\n       Either **download from the cache**:\n       ```\n       nix-store -r $resultPath \\\n         --option binary-caches 'https://cache.nixos.org/ https://nixpkgs-update-cache.nix-community.org/' \\\n         --option trusted-public-keys '\n         nixpkgs-update-cache.nix-community.org-1:U8d6wiQecHUPJFSqHN9GSSmNkmdiFW7GW7WNAnHW0SM=\n         cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY=\n         '\n       ```\n       (The nixpkgs-update cache is only trusted for this store-path realization.)\n       For the cached download to work, your user must be in the `trusted-users` list or you can use `sudo` since root is effectively trusted.\n\n       Or, **build yourself**:\n       |]\n    else do\n      lift $ log \"skipping cache\"\n      return \"Build yourself:\"\n\nupdatePackage ::\n  Options ->\n  Text ->\n  IO ()\nupdatePackage o updateInfo = do\n  let (p, oldV, newV, url) = head (rights 
(parseUpdates updateInfo))\n  let updateInfoLine = (p <> \" \" <> oldV <> \" -> \" <> newV <> fromMaybe \"\" (fmap (\" \" <>) url))\n  let updateEnv = UpdateEnv p oldV newV url o\n  let log = T.putStrLn\n  liftIO $ notifyOptions log o\n  updated <- updatePackageBatch log updateInfoLine updateEnv\n  case updated of\n    UpdatePackageFailure -> do\n      log $ \"[result] Failed to update \" <> updateInfoLine\n    UpdatePackageSuccess -> do\n      log $ \"[result] Success updating \" <> updateInfoLine\n\nwithWorktree :: Text -> Text -> UpdateEnv -> IO a -> IO a\nwithWorktree branch attrpath updateEnv action = do\n  bracket\n    ( do\n        dir <- U.worktreeDir\n        let path = dir <> \"/\" <> T.unpack (T.replace \".lock\" \"_lock\" attrpath)\n        Git.worktreeRemove path\n        Git.delete1 (branchName updateEnv)\n        Git.worktreeAdd path branch updateEnv\n        pure path\n    )\n    ( \\path -> do\n        Git.worktreeRemove path\n        Git.delete1 (branchName updateEnv)\n    )\n    (\\path -> withCurrentDirectory path action)\n"
  },
  {
    "path": "src/Utils.hs",
    "content": "{-# LANGUAGE ExtendedDefaultRules #-}\n{-# LANGUAGE OverloadedStrings #-}\n{-# LANGUAGE QuasiQuotes #-}\n{-# OPTIONS_GHC -fno-warn-type-defaults #-}\n\nmodule Utils\n  ( Boundary (..),\n    Options (..),\n    ProductID,\n    URL,\n    UpdateEnv (..),\n    Version,\n    VersionMatcher (..),\n    branchName,\n    branchPrefix,\n    getGithubToken,\n    getGithubUser,\n    logDir,\n    nixBuildOptions,\n    nixCommonOptions,\n    parseUpdates,\n    prTitle,\n    runLog,\n    srcOrMain,\n    titleVersion,\n    whenBatch,\n    regDirMode,\n    outpathCacheDir,\n    cacheDir,\n    worktreeDir,\n  )\nwhere\n\nimport Data.Bits ((.|.))\nimport Data.Maybe (fromJust)\nimport qualified Data.Text as T\nimport qualified Data.Text.IO as T\nimport Database.SQLite.Simple (ResultError (..), SQLData (..))\nimport Database.SQLite.Simple.FromField\n  ( FieldParser,\n    FromField,\n    fromField,\n    returnError,\n  )\nimport Database.SQLite.Simple.Internal (Field (..))\nimport Database.SQLite.Simple.Ok (Ok (..))\nimport Database.SQLite.Simple.ToField (ToField, toField)\nimport qualified GitHub as GH\nimport OurPrelude\nimport Polysemy.Output\nimport System.Directory (createDirectoryIfMissing, doesDirectoryExist)\nimport System.Environment (lookupEnv)\nimport System.Posix.Directory (createDirectory)\nimport System.Posix.Env (getEnv)\nimport System.Posix.Files\n  ( directoryMode,\n    fileExist,\n    groupModes,\n    otherExecuteMode,\n    otherReadMode,\n    ownerModes,\n  )\nimport System.Posix.Temp (mkdtemp)\nimport System.Posix.Types (FileMode)\nimport Text.Read (readEither)\nimport Type.Reflection (Typeable)\n\ndefault (T.Text)\n\ntype ProductID = Text\n\ntype Version = Text\n\ntype URL = Text\n\n-- | The Ord instance is used to sort lists of matchers in order to compare them\n-- as a set, it is not useful for comparing bounds since the ordering of bounds\n-- depends on whether it is a start or end bound.\ndata Boundary a\n  = Unbounded\n  | Including a\n  | 
Excluding a\n  deriving (Eq, Ord, Show, Read)\n\n-- | The Ord instance is used to sort lists of matchers in order to compare them\n-- as a set, it is not useful for comparing versions.\ndata VersionMatcher\n  = SingleMatcher Version\n  | RangeMatcher (Boundary Version) (Boundary Version)\n  deriving (Eq, Ord, Show, Read)\n\nreadField :: (Read a, Typeable a) => FieldParser a\nreadField f@(Field (SQLText t) _) =\n  case readEither (T.unpack t) of\n    Right x -> Ok x\n    Left e -> returnError ConversionFailed f $ \"read error: \" <> e\nreadField f = returnError ConversionFailed f \"expecting SQLText column type\"\n\nshowField :: Show a => a -> SQLData\nshowField = toField . show\n\ninstance FromField VersionMatcher where\n  fromField = readField\n\ninstance ToField VersionMatcher where\n  toField = showField\n\ndata Options = Options\n  { doPR :: Bool,\n    batchUpdate :: Bool,\n    githubUser :: GH.Name GH.Owner,\n    githubToken :: Text,\n    makeCVEReport :: Bool,\n    runNixpkgsReview :: Bool,\n    calculateOutpaths :: Bool,\n    attrpath :: Bool\n  }\n  deriving (Show)\n\ndata UpdateEnv = UpdateEnv\n  { packageName :: Text,\n    oldVersion :: Version,\n    newVersion :: Version,\n    sourceURL :: Maybe URL,\n    options :: Options\n  }\n\nwhenBatch :: Applicative f => UpdateEnv -> f () -> f ()\nwhenBatch updateEnv = when (batchUpdate . options $ updateEnv)\n\nprTitle :: UpdateEnv -> Text -> Text\nprTitle updateEnv attrPath =\n  let oV = oldVersion updateEnv\n      nV = newVersion updateEnv\n   in T.strip [interpolate| $attrPath: $oV -> $nV |]\n\ntitleVersion :: Text -> Maybe Version\ntitleVersion title = if T.null prefix then Nothing else Just suffix\n  where\n    (prefix, suffix) = T.breakOnEnd \" -> \" title\n\nregDirMode :: FileMode\nregDirMode =\n  directoryMode\n    .|. ownerModes\n    .|. groupModes\n    .|. otherReadMode\n    .|. 
otherExecuteMode\n\nlogsDirectory :: MonadIO m => ExceptT Text m FilePath\nlogsDirectory = do\n  dir <-\n    noteT \"Could not get environment variable LOGS_DIRECTORY\" $\n      MaybeT $\n        liftIO $\n          getEnv \"LOGS_DIRECTORY\"\n  dirExists <- liftIO $ doesDirectoryExist dir\n  tryAssert (\"LOGS_DIRECTORY \" <> T.pack dir <> \" does not exist.\") dirExists\n  return dir\n\ncacheDir :: MonadIO m => m FilePath\ncacheDir = do\n  cacheDirectory <- liftIO $ lookupEnv \"CACHE_DIRECTORY\"\n  xdgCacheHome <- liftIO $ fmap (fmap (\\dir -> dir </> \"nixpkgs-update\")) $ lookupEnv \"XDG_CACHE_HOME\"\n  cacheHome <- liftIO $ fmap (fmap (\\dir -> dir </> \".cache/nixpkgs-update\")) $ lookupEnv \"HOME\"\n  let dir = fromJust (cacheDirectory <|> xdgCacheHome <|> cacheHome)\n  liftIO $ createDirectoryIfMissing True dir\n  return dir\n\noutpathCacheDir :: MonadIO m => m FilePath\noutpathCacheDir = do\n  cache <- cacheDir\n  let dir = cache </> \"outpath\"\n  liftIO $ createDirectoryIfMissing False dir\n  return dir\n\nworktreeDir :: IO FilePath\nworktreeDir = do\n  cache <- cacheDir\n  let dir = cache </> \"worktree\"\n  createDirectoryIfMissing False dir\n  return dir\n\nxdgRuntimeDir :: MonadIO m => ExceptT Text m FilePath\nxdgRuntimeDir = do\n  xDir <-\n    noteT \"Could not get environment variable XDG_RUNTIME_DIR\" $\n      MaybeT $\n        liftIO $\n          getEnv \"XDG_RUNTIME_DIR\"\n  xDirExists <- liftIO $ doesDirectoryExist xDir\n  tryAssert (\"XDG_RUNTIME_DIR \" <> T.pack xDir <> \" does not exist.\") xDirExists\n  let dir = xDir </> \"nixpkgs-update\"\n  dirExists <- liftIO $ fileExist dir\n  unless\n    dirExists\n    ( liftIO $\n        putStrLn \"creating xdgRuntimeDir\" >> createDirectory dir regDirMode\n    )\n  return dir\n\ntmpRuntimeDir :: MonadIO m => ExceptT Text m FilePath\ntmpRuntimeDir = do\n  dir <- liftIO $ 
mkdtemp \"nixpkgs-update\"\n  dirExists <- liftIO $ doesDirectoryExist dir\n  tryAssert\n    (\"Temporary directory \" <> T.pack dir <> \" does not exist.\")\n    dirExists\n  return dir\n\nlogDir :: IO FilePath\nlogDir = do\n  r <-\n    runExceptT\n      ( logsDirectory\n          <|> xdgRuntimeDir\n          <|> tmpRuntimeDir\n          <|> throwE\n            \"Failed to create log directory.\"\n      )\n  case r of\n    Right dir -> return dir\n    Left e -> error $ T.unpack e\n\nbranchPrefix :: Text\nbranchPrefix = \"auto-update/\"\n\nbranchName :: UpdateEnv -> Text\nbranchName ue = branchPrefix <> packageName ue\n\nparseUpdates :: Text -> [Either Text (Text, Version, Version, Maybe URL)]\nparseUpdates = map (toTriple . T.words) . T.lines\n  where\n    toTriple :: [Text] -> Either Text (Text, Version, Version, Maybe URL)\n    toTriple [package, oldVer, newVer] = Right (package, oldVer, newVer, Nothing)\n    toTriple [package, oldVer, newVer, url] = Right (package, oldVer, newVer, Just url)\n    toTriple line = Left $ \"Unable to parse update: \" <> T.unwords line\n\nsrcOrMain :: MonadIO m => (Text -> ExceptT Text m a) -> Text -> ExceptT Text m a\nsrcOrMain et attrPath = et (attrPath <> \".src\") <|> et (attrPath <> \".originalSrc\") <|> et attrPath\n\nnixCommonOptions :: [String]\nnixCommonOptions =\n  [ \"--arg\",\n    \"config\",\n    \"{ allowUnfree = true; allowAliases = false; }\",\n    \"--arg\",\n    \"overlays\",\n    \"[ ]\"\n  ]\n\nnixBuildOptions :: [String]\nnixBuildOptions =\n  [ \"--option\",\n    \"sandbox\",\n    \"true\"\n  ]\n    <> nixCommonOptions\n\nrunLog ::\n  Member (Embed IO) r =>\n  (Text -> IO ()) ->\n  Sem ((Output Text) ': r) a ->\n  Sem r a\nrunLog logger =\n  interpret \\case\n    Output o -> embed $ logger o\n\nenvToken :: IO (Maybe Text)\nenvToken = fmap tshow <$> getEnv \"GITHUB_TOKEN\"\n\nlocalToken :: IO (Maybe Text)\nlocalToken = do\n  exists <- fileExist \"github_token.txt\"\n  if exists\n    then (Just . 
T.strip <$> T.readFile \"github_token.txt\")\n    else (return Nothing)\n\nhubFileLocation :: IO (Maybe FilePath)\nhubFileLocation = do\n  xloc <- fmap (</> \"hub\") <$> getEnv \"XDG_CONFIG_HOME\"\n  hloc <- fmap (</> \".config/hub\") <$> getEnv \"HOME\"\n  return (xloc <|> hloc)\n\nhubConfigField :: Text -> IO (Maybe Text)\nhubConfigField field = do\n  hubFile <- hubFileLocation\n  case hubFile of\n    Nothing -> return Nothing\n    Just file -> do\n      exists <- fileExist file\n      if not exists\n        then return Nothing\n        else do\n          contents <- T.readFile file\n          -- Return Nothing when the field is absent instead of crashing on a\n          -- partial 'head'.\n          case drop 1 (T.splitOn field contents) of\n            [] -> return Nothing\n            (rest : _) -> return $ Just (T.takeWhile (/= '\\n') rest)\n\ngetGithubToken :: IO (Maybe Text)\ngetGithubToken = do\n  et <- envToken\n  lt <- localToken\n  ht <- hubConfigField \"oauth_token: \"\n  return (et <|> lt <|> ht)\n\ngetGithubUser :: IO (GH.Name GH.Owner)\ngetGithubUser = do\n  hubUser <- hubConfigField \"user: \"\n  case hubUser of\n    Just usr -> return $ GH.mkOwnerName usr\n    Nothing -> return $ GH.mkOwnerName \"r-ryantm\"\n"
  },
  {
    "path": "src/Version.hs",
    "content": "{-# LANGUAGE FlexibleInstances #-}\n{-# LANGUAGE NamedFieldPuns #-}\n{-# LANGUAGE OverloadedStrings #-}\n\nmodule Version\n  ( assertCompatibleWithPathPin,\n    matchVersion,\n  )\nwhere\n\nimport Data.Char (isAlpha, isDigit)\nimport Data.Foldable (toList)\nimport Data.Function (on)\nimport qualified Data.PartialOrd as PO\nimport qualified Data.Text as T\nimport Data.Versions (SemVer (..), VUnit (..), semver)\nimport OurPrelude\nimport Utils\n\nnotElemOf :: (Eq a, Foldable t) => t a -> a -> Bool\nnotElemOf o = not . flip elem o\n\n-- | Similar to @breakOn@, but will not keep the pattern at the beginning of the suffix.\n--\n-- Examples:\n--\n-- >>> clearBreakOn \"::\" \"a::b::c\"\n-- (\"a\",\"b::c\")\nclearBreakOn :: Text -> Text -> (Text, Text)\nclearBreakOn boundary string =\n  let (prefix, suffix) = T.breakOn boundary string\n   in if T.null suffix\n        then (prefix, suffix)\n        else (prefix, T.drop (T.length boundary) suffix)\n\n-- | Check if attribute path is not pinned to a certain version.\n-- If a derivation is expected to stay at certain version branch,\n-- it will usually have the branch as a part of the attribute path.\n--\n-- Examples:\n--\n-- >>> versionCompatibleWithPathPin \"libgit2_0_25\" \"0.25.3\"\n-- True\n--\n-- >>> versionCompatibleWithPathPin \"owncloud90\" \"9.0.3\"\n-- True\n--\n-- >>> versionCompatibleWithPathPin \"owncloud-client\" \"2.4.1\"\n-- True\n--\n-- >>> versionCompatibleWithPathPin \"owncloud90\" \"9.1.3\"\n-- False\n--\n-- >>> versionCompatibleWithPathPin \"nodejs-slim-10_x\" \"11.2.0\"\n-- False\n--\n-- >>> versionCompatibleWithPathPin \"nodejs-slim-10_x\" \"10.12.0\"\n-- True\n--\n-- >>> versionCompatibleWithPathPin \"firefox-esr-78-unwrapped\" \"91.1.0esr\"\n-- False\nversionCompatibleWithPathPin :: Text -> Version -> Bool\nversionCompatibleWithPathPin attrPath newVer\n  | \"-unwrapped\" `T.isSuffixOf` attrPath =\n      versionCompatibleWithPathPin (T.dropEnd 10 attrPath) newVer\n  | \"_x\" 
`T.isSuffixOf` T.toLower attrPath =\n      versionCompatibleWithPathPin (T.dropEnd 2 attrPath) newVer\n  | \"_\" `T.isInfixOf` attrPath =\n      let attrVersionPart =\n            let (_, version) = clearBreakOn \"_\" attrPath\n             in if T.any (notElemOf ('_' : ['0' .. '9'])) version\n                  then Nothing\n                  else Just version\n          -- Check assuming version part has underscore separators\n          attrVersionPeriods = T.replace \"_\" \".\" <$> attrVersionPart\n       in -- If we don't find version numbers in the attr path, exit success.\n          maybe True (`T.isPrefixOf` newVer) attrVersionPeriods\n  | otherwise =\n      let attrVersionPart =\n            let version = T.dropWhile (notElemOf ['0' .. '9']) attrPath\n             in if T.any (notElemOf ['0' .. '9']) version\n                  then Nothing\n                  else Just version\n          -- Check assuming version part is the prefix of the version with dots\n          -- removed. For example, 91 => \"9.1\"\n          noPeriodNewVersion = T.replace \".\" \"\" newVer\n       in -- If we don't find version numbers in the attr path, exit success.\n          maybe True (`T.isPrefixOf` noPeriodNewVersion) attrVersionPart\n\nversionIncompatibleWithPathPin :: Text -> Version -> Bool\nversionIncompatibleWithPathPin path version =\n  not (versionCompatibleWithPathPin path version)\n\nassertCompatibleWithPathPin :: Monad m => UpdateEnv -> Text -> ExceptT Text m ()\nassertCompatibleWithPathPin ue attrPath =\n  tryAssert\n    ( \"Version in attr path \"\n        <> attrPath\n        <> \" not compatible with \"\n        <> newVersion ue\n    )\n    ( not\n        ( versionCompatibleWithPathPin attrPath (oldVersion ue)\n            && versionIncompatibleWithPathPin attrPath (newVersion ue)\n        )\n    )\n\ndata VersionPart\n  = PreReleasePart VersionPart\n  | EmptyPart\n  | IntPart Word\n  | TextPart Text\n  deriving (Show, Eq)\n\ndata ParsedVersion\n  = SemanticVersion 
SemVer\n  | SimpleVersion [VersionPart]\n  deriving (Show, Eq)\n\npreReleaseTexts :: [Text]\npreReleaseTexts = [\"alpha\", \"beta\", \"pre\", \"rc\"]\n\ntextPart :: Text -> VersionPart\ntextPart t\n  | tLower `elem` preReleaseTexts = PreReleasePart $ TextPart tLower\n  | otherwise = TextPart tLower\n  where\n    tLower = T.toLower t\n\nclass SimpleVersion a where\n  simpleVersion :: a -> [VersionPart]\n\ninstance SimpleVersion Text where\n  simpleVersion t\n    | digitHead /= \"\" = IntPart number : simpleVersion digitTail\n    | alphaHead /= \"\" = textPart alphaHead : simpleVersion alphaTail\n    | otherwise = []\n    where\n      t' = T.dropWhile (\\c -> not (isAlpha c || isDigit c)) t\n      (digitHead, digitTail) = T.span isDigit t'\n      number = read $ T.unpack digitHead\n      (alphaHead, alphaTail) = T.span isAlpha t'\n\ninstance SimpleVersion ParsedVersion where\n  simpleVersion (SimpleVersion v) = v\n  simpleVersion (SemanticVersion v) = simpleVersion v\n\ninstance SimpleVersion SemVer where\n  simpleVersion SemVer {_svMajor, _svMinor, _svPatch, _svPreRel} =\n    [IntPart _svMajor, IntPart _svMinor, IntPart _svPatch]\n      ++ map toPart (concat (fmap toList _svPreRel))\n    where\n      toPart :: VUnit -> VersionPart\n      toPart (Digits i) = IntPart i\n      toPart (Str t) =\n        case textPart t of\n          PreReleasePart p -> PreReleasePart p\n          p -> PreReleasePart p\n\ninstance SimpleVersion [VersionPart] where\n  simpleVersion = id\n\n-- | Pre-release parts come before empty parts, everything else comes after\n-- them. 
Int and text parts compare to themselves as expected and comparison\n-- between them is not defined.\ninstance PO.PartialOrd VersionPart where\n  PreReleasePart a <= PreReleasePart b = a PO.<= b\n  PreReleasePart _ <= _ = True\n  _ <= PreReleasePart _ = False\n  EmptyPart <= _ = True\n  _ <= EmptyPart = False\n  IntPart a <= IntPart b = a <= b\n  TextPart a <= TextPart b = a <= b\n  _ <= _ = False\n\n-- | If either version contains no comparable parts, the versions are not\n-- comparable. If both contain at least some parts, compare parts in order. When\n-- a version runs out of parts, its remaining parts are considered empty parts,\n-- which come after pre-release parts, but before other parts.\n--\n-- Examples:\n--\n-- >>> on PO.compare parseVersion \"1.2.3\" \"1.2.4\"\n-- Just LT\n--\n-- >>> on PO.compare parseVersion \"1.0\" \"-\"\n-- Nothing\n--\n-- >>> on PO.compare parseVersion \"-\" \"-\"\n-- Nothing\n--\n-- >>> on PO.compare parseVersion \"1.0\" \"1_0_0\"\n-- Just LT\n--\n-- >>> on PO.compare parseVersion \"1.0-pre3\" \"1.0\"\n-- Just LT\n--\n-- >>> on PO.compare parseVersion \"1.1\" \"1.a\"\n-- Nothing\ninstance PO.PartialOrd ParsedVersion where\n  SemanticVersion a <= SemanticVersion b = a <= b\n  SimpleVersion [] <= _ = False\n  _ <= SimpleVersion [] = False\n  a <= b = on lessOrEq simpleVersion a b\n    where\n      lessOrEq [] [] = True\n      lessOrEq [] ys = lessOrEq [EmptyPart] ys\n      lessOrEq xs [] = lessOrEq xs [EmptyPart]\n      lessOrEq (x : xs) (y : ys) =\n        case PO.compare x y of\n          Just LT -> True\n          Just EQ -> lessOrEq xs ys\n          Just GT -> False\n          Nothing -> False\n\nparseVersion :: Version -> ParsedVersion\nparseVersion v =\n  case semver v of\n    Left _ -> SimpleVersion $ simpleVersion v\n    Right v' -> SemanticVersion v'\n\nmatchUpperBound :: Boundary Version -> Version -> Bool\nmatchUpperBound Unbounded _ = True\nmatchUpperBound (Including b) v = parseVersion v PO.<= parseVersion 
b\nmatchUpperBound (Excluding b) v = parseVersion v PO.< parseVersion b\n\nmatchLowerBound :: Boundary Version -> Version -> Bool\nmatchLowerBound Unbounded _ = True\nmatchLowerBound (Including b) v = parseVersion b PO.<= parseVersion v\nmatchLowerBound (Excluding b) v = parseVersion b PO.< parseVersion v\n\n-- | Reports True only if matcher certainly matches. When the order or equality\n-- of versions is ambiguous, return False.\n--\n-- Examples:\n--\n-- >>> matchVersion (SingleMatcher \"1.2.3\") \"1_2-3\"\n-- True\n--\n-- >>> matchVersion (RangeMatcher Unbounded (Including \"1.0-pre3\")) \"1.0\"\n-- False\n--\n-- >>> matchVersion (RangeMatcher Unbounded (Excluding \"1.0-rev3\")) \"1.0\"\n-- True\nmatchVersion :: VersionMatcher -> Version -> Bool\nmatchVersion (SingleMatcher v) v' = parseVersion v PO.== parseVersion v'\nmatchVersion (RangeMatcher lowerBound upperBound) v =\n  matchLowerBound lowerBound v && matchUpperBound upperBound v\n"
  },
  {
    "path": "test/CheckSpec.hs",
    "content": "module CheckSpec where\n\nimport qualified Check\nimport qualified Data.Text as T\nimport Test.Hspec\n\nmain :: IO ()\nmain = hspec spec\n\nspec :: Spec\nspec = do\n  describe \"version check\" do\n    let seaweedVersion234 = T.pack \"version 30GB 2.34 linux amd64\"\n    it \"finds the version when present\" do\n      Check.hasVersion (T.pack \"The version is 2.34\") (T.pack \"2.34\") `shouldBe` True\n      Check.hasVersion (T.pack \"The version is 2.34.\") (T.pack \"2.34\") `shouldBe` True\n      Check.hasVersion (T.pack \"2.34 is the version\") (T.pack \"2.34\") `shouldBe` True\n      Check.hasVersion seaweedVersion234 (T.pack \"2.34\") `shouldBe` True\n\n    it \"doesn't produce false positives\" do\n      Check.hasVersion (T.pack \"The version is 12.34\") (T.pack \"2.34\") `shouldBe` False\n      Check.hasVersion (T.pack \"The version is 2.345\") (T.pack \"2.34\") `shouldBe` False\n      Check.hasVersion (T.pack \"The version is 2.35\") (T.pack \"2.34\") `shouldBe` False\n      Check.hasVersion (T.pack \"2.35 is the version\") (T.pack \"2.34\") `shouldBe` False\n      Check.hasVersion (T.pack \"2.345 is the version\") (T.pack \"2.34\") `shouldBe` False\n      Check.hasVersion (T.pack \"12.34 is the version\") (T.pack \"2.34\") `shouldBe` False\n      Check.hasVersion seaweedVersion234 (T.pack \"2.35\") `shouldBe` False\n\n    it \"negative lookahead construction\" do\n      Check.versionWithoutPath \"/nix/store/z9l2xakz7cgw6yfh83nh542pvc0g4rkq-geeqie-2.0.1\" (T.pack \"2.0.1\") `shouldBe` \"(?<!\\\\Qz9l2xakz7cgw6yfh83nh542pvc0g4rkq-geeqie-\\\\E)\\\\Q2.0.1\\\\E\"\n      Check.versionWithoutPath \"/nix/store/z9l2xakz7cgw6yfh83nh542pvc0g4rkq-abc\" (T.pack \"2.0.1\") `shouldBe` \"\\\\Q2.0.1\\\\E\"\n      Check.versionWithoutPath \"/nix/store/z9l2xakz7cgw6yfh83nh542pvc0g4rkq-2.0.1-dev\" (T.pack \"2.0.1\") `shouldBe` \"(?<!\\\\Qz9l2xakz7cgw6yfh83nh542pvc0g4rkq-\\\\E)\\\\Q2.0.1\\\\E\"\n"
  },
  {
    "path": "test/DoctestSpec.hs",
    "content": "module DoctestSpec where\n\nimport Test.DocTest\nimport Test.Hspec\n\nmain :: IO ()\nmain = hspec spec\n\nspec :: Spec\nspec = do\n  describe \"Doctests\" do\n    it \"should all pass\" do\n      doctest\n        [ \"-isrc\",\n          \"-XOverloadedStrings\",\n          \"-XDataKinds\",\n          \"-XFlexibleContexts\",\n          \"-XGADTs\",\n          \"-XLambdaCase\",\n          \"-XPolyKinds\",\n          \"-XRankNTypes\",\n          \"-XScopedTypeVariables\",\n          \"-XTypeApplications\",\n          \"-XTypeFamilies\",\n          \"-XTypeOperators\",\n          \"-XBlockArguments\",\n          \"-flate-specialise\",\n          \"-fspecialise-aggressively\",\n          \"-fplugin=Polysemy.Plugin\",\n          \"src/Version.hs\",\n          \"src/GH.hs\"\n        ]\n"
  },
  {
    "path": "test/Spec.hs",
    "content": "{-# OPTIONS_GHC -F -pgmF hspec-discover #-}\n"
  },
  {
    "path": "test/UpdateSpec.hs",
    "content": "{-# LANGUAGE OverloadedStrings #-}\n\nmodule UpdateSpec where\n\nimport qualified Data.Text.IO as T\nimport Test.Hspec\nimport qualified Update\nimport qualified Utils\n\nmain :: IO ()\nmain = hspec spec\n\n-- Tests here will write their actual results to an actual*.md file next to the\n-- expected one. If you are updating the PR description, run the test suite and\n-- copy all the actuals over to expected. This can be viewed with `git diff`\n-- then to ensure your changes look as expected.\nspec :: Spec\nspec = do\n  describe \"PR message\" do\n    -- Common mock options\n    let options = Utils.Options False False \"r-ryantm\" \"\" False False False False\n    let updateEnv = Utils.UpdateEnv \"foobar\" \"1.0\" \"1.1\" (Just \"https://update-site.com\") options\n    let metaDescription = \"\\\"Foobar package description\\\"\"\n    let metaHomepage = \"\\\"https://foobar-homepage.com\\\"\"\n    let metaChangelog = \"\\\"https://foobar-homepage.com/changelog/v1.2.3\\\"\"\n    let rewriteMsgs = [\"Version Update\", \"Other Update\"]\n    let releaseUrl = \"https://github.com/foobar/releases\"\n    let compareUrl = \"https://github.com/foobar/compare\"\n    let resultCheckReport = \"- Some other check\"\n    let commitHash = \"af39cf77a0d42a4f6771043ec54221ed\"\n    let attrPath = \"foobar\"\n    let maintainersCc = \"@maintainer1\"\n    let resultPath = \"/nix/store/some-hash-path\"\n    let opReport = \"123 total rebuild path(s)\"\n    let cveRep = \"\"\n    let cacheTestInstructions = \"\"\n    let nixpkgsReviewMsg = \"nixpkgs-review comment body\"\n\n    it \"matches a simple mock example\" do\n      expected <- T.readFile \"test_data/expected_pr_description_1.md\"\n      let actual = Update.prMessage updateEnv metaDescription metaHomepage metaChangelog rewriteMsgs releaseUrl compareUrl resultCheckReport commitHash attrPath maintainersCc resultPath opReport cveRep cacheTestInstructions nixpkgsReviewMsg\n      T.writeFile 
\"test_data/actual_pr_description_1.md\" actual\n      actual `shouldBe` expected\n\n    it \"does not include Nixpkgs review section when no review was done\" do\n      expected <- T.readFile \"test_data/expected_pr_description_2.md\"\n      let nixpkgsReviewMsg' = \"\"\n      let actual = Update.prMessage updateEnv metaDescription metaHomepage metaChangelog rewriteMsgs releaseUrl compareUrl resultCheckReport commitHash attrPath maintainersCc resultPath opReport cveRep cacheTestInstructions nixpkgsReviewMsg'\n      T.writeFile \"test_data/actual_pr_description_2.md\" actual\n      actual `shouldBe` expected\n"
  },
  {
    "path": "test/UtilsSpec.hs",
    "content": "{-# LANGUAGE OverloadedStrings #-}\n\nmodule UtilsSpec where\n\nimport Test.Hspec\nimport qualified Utils\n\nmain :: IO ()\nmain = hspec spec\n\nspec :: Spec\nspec = do\n  let options = Utils.Options False False \"\" \"\" False False False False\n  let updateEnv = Utils.UpdateEnv \"foobar\" \"1.0\" \"1.1\" (Just \"https://update-site.com\") options\n  describe \"PR title\" do\n    -- This breaks IRC when it tries to link to newly opened pull requests\n    it \"should not include a trailing newline\" do\n      let title = Utils.prTitle updateEnv \"python37Packages.foobar\"\n      title `shouldBe` \"python37Packages.foobar: 1.0 -> 1.1\"\n\n  describe \"titleVersion\" do\n    it \"should parse prTitle output\" do\n      let title = Utils.prTitle updateEnv \"python37Packages.foobar\"\n      let version = Utils.titleVersion title\n      version `shouldBe` Just \"1.1\"\n\n    it \"should fail on unexpected commit messages\" do\n      let version = Utils.titleVersion \"not a prTitle-style commit message\"\n      version `shouldBe` Nothing\n"
  },
  {
    "path": "test_data/expected_pr_description_1.md",
    "content": "Semi-automatic update generated by [nixpkgs-update](https://github.com/nix-community/nixpkgs-update) tools. This update was made based on information from https://update-site.com.\n\nmeta.description for foobar is: \"Foobar package description\"\n\nmeta.homepage for foobar is: \"https://foobar-homepage.com\"\n\nmeta.changelog for foobar is: \"https://foobar-homepage.com/changelog/v1.2.3\"\n\n\n###### Updates performed\n- Version Update\n- Other Update\n\n###### To inspect upstream changes\n\n- [Release on GitHub](https://github.com/foobar/releases)\n\n- [Compare changes on GitHub](https://github.com/foobar/compare)\n\n###### Impact\n\n<b>Checks done</b>\n\n---\n\n- built on NixOS\n- Some other check\n\n---\n\n<details>\n<summary>\n<b>Rebuild report</b> (if merged into master) (click to expand)\n</summary>\n\n```\n123 total rebuild path(s)\n```\n\n</details>\n\n<details>\n<summary>\n<b>Instructions to test this update</b> (click to expand)\n</summary>\n\n---\n\n\n```\nnix-build -A foobar https://github.com/r-ryantm/nixpkgs/archive/af39cf77a0d42a4f6771043ec54221ed.tar.gz\n```\nOr:\n```\nnix build github:r-ryantm/nixpkgs/af39cf77a0d42a4f6771043ec54221ed#foobar\n```\n\nAfter you've downloaded or built it, look at the files and if there are any, run the binaries:\n```\nls -la /nix/store/some-hash-path\nls -la /nix/store/some-hash-path/bin\n```\n\n---\n\n</details>\n<br/>\n\n\n\n### Pre-merge build results\n\nWe have automatically built all packages that will get rebuilt due to\nthis change.\n\nThis gives evidence on whether the upgrade will break dependent packages.\nNote sometimes packages show up as _failed to build_ independent of the\nchange, simply because they are already broken on the target branch.\n\nnixpkgs-review comment body\n\n---\n\n###### Maintainer pings\n\ncc @maintainer1 for [testing](https://github.com/nix-community/nixpkgs-update/blob/main/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do).\n\n> [!TIP]\n> 
As a maintainer, if your package is located under `pkgs/by-name/*`, you can comment **`@NixOS/nixpkgs-merge-bot merge`** to automatically merge this update using the [`nixpkgs-merge-bot`](https://github.com/NixOS/nixpkgs/blob/master/ci/README.md#nixpkgs-merge-bot)."
  },
  {
    "path": "test_data/expected_pr_description_2.md",
    "content": "Semi-automatic update generated by [nixpkgs-update](https://github.com/nix-community/nixpkgs-update) tools. This update was made based on information from https://update-site.com.\n\nmeta.description for foobar is: \"Foobar package description\"\n\nmeta.homepage for foobar is: \"https://foobar-homepage.com\"\n\nmeta.changelog for foobar is: \"https://foobar-homepage.com/changelog/v1.2.3\"\n\n\n###### Updates performed\n- Version Update\n- Other Update\n\n###### To inspect upstream changes\n\n- [Release on GitHub](https://github.com/foobar/releases)\n\n- [Compare changes on GitHub](https://github.com/foobar/compare)\n\n###### Impact\n\n<b>Checks done</b>\n\n---\n\n- built on NixOS\n- Some other check\n\n---\n\n<details>\n<summary>\n<b>Rebuild report</b> (if merged into master) (click to expand)\n</summary>\n\n```\n123 total rebuild path(s)\n```\n\n</details>\n\n<details>\n<summary>\n<b>Instructions to test this update</b> (click to expand)\n</summary>\n\n---\n\n\n```\nnix-build -A foobar https://github.com/r-ryantm/nixpkgs/archive/af39cf77a0d42a4f6771043ec54221ed.tar.gz\n```\nOr:\n```\nnix build github:r-ryantm/nixpkgs/af39cf77a0d42a4f6771043ec54221ed#foobar\n```\n\nAfter you've downloaded or built it, look at the files and if there are any, run the binaries:\n```\nls -la /nix/store/some-hash-path\nls -la /nix/store/some-hash-path/bin\n```\n\n---\n\n</details>\n<br/>\n\n\n\n### Pre-merge build results\n\nNixpkgs review skipped\n\n---\n\n###### Maintainer pings\n\ncc @maintainer1 for [testing](https://github.com/nix-community/nixpkgs-update/blob/main/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do).\n\n> [!TIP]\n> As a maintainer, if your package is located under `pkgs/by-name/*`, you can comment **`@NixOS/nixpkgs-merge-bot merge`** to automatically merge this update using the [`nixpkgs-merge-bot`](https://github.com/NixOS/nixpkgs/blob/master/ci/README.md#nixpkgs-merge-bot)."
  },
  {
    "path": "test_data/quoted_homepage_bad.nix",
    "content": "{ stdenv, fetchFromGitHub, autoreconfHook, pkgconfig\n, gnutls, libite, libconfuse }:\n\nstdenv.mkDerivation rec {\n  pname = \"inadyn\";\n  version = \"2.6\";\n\n  src = fetchFromGitHub {\n    owner = \"troglobit\";\n    repo = \"inadyn\";\n    rev = \"v${version}\";\n    sha256 = \"013kxlglxliajv3lrsix4w88w40g709rvycajb6ad6gbh8giqv47\";\n  };\n\n  nativeBuildInputs = [ autoreconfHook pkgconfig ];\n\n  buildInputs = [ gnutls libite libconfuse ];\n\n  enableParallelBuilding = true;\n\n  meta = with stdenv.lib; {\n    homepage = http://troglobit.com/project/inadyn/;\n    description = \"Free dynamic DNS client\";\n    license = licenses.gpl2Plus;\n    maintainers = with maintainers; [ ];\n    platforms = platforms.linux;\n  };\n}\n"
  },
  {
    "path": "test_data/quoted_homepage_good.nix",
    "content": "{ stdenv, fetchFromGitHub, autoreconfHook, pkgconfig\n, gnutls, libite, libconfuse }:\n\nstdenv.mkDerivation rec {\n  pname = \"inadyn\";\n  version = \"2.6\";\n\n  src = fetchFromGitHub {\n    owner = \"troglobit\";\n    repo = \"inadyn\";\n    rev = \"v${version}\";\n    sha256 = \"013kxlglxliajv3lrsix4w88w40g709rvycajb6ad6gbh8giqv47\";\n  };\n\n  nativeBuildInputs = [ autoreconfHook pkgconfig ];\n\n  buildInputs = [ gnutls libite libconfuse ];\n\n  enableParallelBuilding = true;\n\n  meta = with stdenv.lib; {\n    homepage = \"http://troglobit.com/project/inadyn/\";\n    description = \"Free dynamic DNS client\";\n    license = licenses.gpl2Plus;\n    maintainers = with maintainers; [ ];\n    platforms = platforms.linux;\n  };\n}\n"
  }
]