Full Code of ryantm/nixpkgs-update for AI

Repository: ryantm/nixpkgs-update
Branch: main
Commit: 09611a165cb3
Files: 71
Total size: 207.5 KB

Directory structure:
nixpkgs-update/

├── .github/
│   ├── CONTRIBUTING.md
│   ├── FUNDING.yml
│   ├── dependabot.yml
│   └── workflows/
│       ├── doc.yaml
│       └── flake-updates.yml
├── .gitignore
├── CVENOTES.org
├── LICENSE
├── README.md
├── app/
│   └── Main.hs
├── doc/
│   ├── batch-updates.md
│   ├── contact.md
│   ├── contributing.md
│   ├── details.md
│   ├── donate.md
│   ├── installation.md
│   ├── interactive-updates.md
│   ├── introduction.md
│   ├── nixpkgs-maintainer-faq.md
│   ├── nixpkgs-update.md
│   ├── nu.md
│   ├── r-ryantm.md
│   └── toc.md
├── flake.nix
├── nixpkgs-update.cabal
├── nixpkgs-update.nix
├── package.yaml
├── pkgs/
│   └── default.nix
├── rust/
│   ├── .envrc
│   ├── .gitignore
│   ├── Cargo.toml
│   ├── diesel.toml
│   ├── flake.nix
│   ├── migrations/
│   │   └── 2023-08-12-152848_create_packages/
│   │       ├── down.sql
│   │       └── up.sql
│   └── src/
│       ├── github.rs
│       ├── lib.rs
│       ├── main.rs
│       ├── models.rs
│       ├── nix.rs
│       ├── repology.rs
│       └── schema.rs
├── src/
│   ├── CVE.hs
│   ├── Check.hs
│   ├── Data/
│   │   └── Hex.hs
│   ├── DeleteMerged.hs
│   ├── File.hs
│   ├── GH.hs
│   ├── Git.hs
│   ├── NVD.hs
│   ├── NVDRules.hs
│   ├── Nix.hs
│   ├── NixpkgsReview.hs
│   ├── OurPrelude.hs
│   ├── Outpaths.hs
│   ├── Process.hs
│   ├── Repology.hs
│   ├── Rewrite.hs
│   ├── Skiplist.hs
│   ├── Update.hs
│   ├── Utils.hs
│   └── Version.hs
├── test/
│   ├── CheckSpec.hs
│   ├── DoctestSpec.hs
│   ├── Spec.hs
│   ├── UpdateSpec.hs
│   └── UtilsSpec.hs
└── test_data/
    ├── expected_pr_description_1.md
    ├── expected_pr_description_2.md
    ├── quoted_homepage_bad.nix
    └── quoted_homepage_good.nix

================================================
FILE CONTENTS
================================================

================================================
FILE: .github/CONTRIBUTING.md
================================================
Thank you for your interest in contributing to nixpkgs-update.

# Licensing

Please indicate that you license your contributions by commenting:

```
I hereby license my contributions to this repository under:
Creative Commons Zero v1.0 Universal (SPDX Short Identifier: CC0-1.0)
```

in this [Pull Request thread](https://github.com/nix-community/nixpkgs-update/pull/116).


================================================
FILE: .github/FUNDING.yml
================================================
github: ryantm
patreon: nixpkgsupdate


================================================
FILE: .github/dependabot.yml
================================================
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"


================================================
FILE: .github/workflows/doc.yaml
================================================
name: doc

on:
  push:
    branches:
      - main
  workflow_dispatch:

permissions:
  contents: read
  pages: write
  id-token: write

concurrency:
  group: "doc"
  cancel-in-progress: false

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6
      - uses: cachix/install-nix-action@v31
        with:
          extra_nix_config: |
            accept-flake-config = true
            experimental-features = nix-command flakes
      - name: Setup Pages
        id: pages
        uses: actions/configure-pages@v6
      - name: Build Pages
        run: |
          nix build .#nixpkgs-update-doc
      - name: Upload artifact
        uses: actions/upload-pages-artifact@v5
        with:
          path: ./result/multi

  deploy:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    needs: publish
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v5


================================================
FILE: .github/workflows/flake-updates.yml
================================================
name: "Update flakes"
on:
  workflow_dispatch:
  schedule:
    - cron: "0 0 1 * *"
jobs:
  createPullRequest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6
      - name: Install Nix
        uses: cachix/install-nix-action@v31
        with:
          extra_nix_config: |
            experimental-features = nix-command flakes
      - name: Update flake.lock
        uses: DeterminateSystems/update-flake-lock@v28


================================================
FILE: .gitignore
================================================
.ghc*
/github_token.txt
/packages-to-update.txt
/result
/result-doc
dist-newstyle/
dist/
test_data/actual*
.direnv


================================================
FILE: CVENOTES.org
================================================
* Issues
** https://github.com/NixOS/nixpkgs/pull/74184#issuecomment-565891652
* Fixed
** uzbl: 0.9.0 -> 0.9.1
  - [[https://nvd.nist.gov/vuln/detail/CVE-2010-0011][CVE-2010-0011]]
  - [[https://nvd.nist.gov/vuln/detail/CVE-2010-2809][CVE-2010-2809]]

  Both CVEs use matchers for date-based release versions, but the
  author of the library later switched to normal version numbering,
  so these CVEs are reported as relevant even though they are not.
** terraform: 0.12.7 -> 0.12.9
   - [[https://nvd.nist.gov/vuln/detail/CVE-2018-9057][CVE-2018-9057]]

   https://nvd.nist.gov/products/cpe/detail/492339?keyword=cpe:2.3:a:hashicorp:terraform:1.12.0:*:*:*:*:aws:*:*&status=FINAL,DEPRECATED&orderBy=CPEURI&namingFormat=2.3

   The CVE only applies to terraform-providers-aws, but you can only tell that by looking at the "Target Software" part.
** tor: 0.4.1.5 -> 0.4.1.6
   https://nvd.nist.gov/vuln/detail/CVE-2017-16541

  The CPE mistakenly uses tor for the product ID when it should be torbrowser.
** arena: 1.1 -> 1.06
  - [[https://nvd.nist.gov/vuln/detail/CVE-2018-8843][CVE-2018-8843]]
  - [[https://nvd.nist.gov/vuln/detail/CVE-2019-15567][CVE-2019-15567]]

   Not rockwellautomation:arena
   Not openforis:arena
** thrift
   Apache Thrift vs Facebook Thrift
** go: 1.13.3 -> 1.13.4
   https://github.com/NixOS/nixpkgs/pull/72516

   It looks like Go used to use dates for versions and now uses
   regular version numbers.
** kanboard: 1.2.11 -> 1.2.12
   https://github.com/NixOS/nixpkgs/pull/74429
   The CVE is about a Kanboard plugin provided by Jenkins, not Kanboard itself.


================================================
FILE: LICENSE
================================================
Creative Commons Legal Code

CC0 1.0 Universal

    CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE
    LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN
    ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS
    INFORMATION ON AN "AS-IS" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES
    REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS
    PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM
    THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED
    HEREUNDER.

Statement of Purpose

The laws of most jurisdictions throughout the world automatically confer
exclusive Copyright and Related Rights (defined below) upon the creator
and subsequent owner(s) (each and all, an "owner") of an original work of
authorship and/or a database (each, a "Work").

Certain owners wish to permanently relinquish those rights to a Work for
the purpose of contributing to a commons of creative, cultural and
scientific works ("Commons") that the public can reliably and without fear
of later claims of infringement build upon, modify, incorporate in other
works, reuse and redistribute as freely as possible in any form whatsoever
and for any purposes, including without limitation commercial purposes.
These owners may contribute to the Commons to promote the ideal of a free
culture and the further production of creative, cultural and scientific
works, or to gain reputation or greater distribution for their Work in
part through the use and efforts of others.

For these and/or other purposes and motivations, and without any
expectation of additional consideration or compensation, the person
associating CC0 with a Work (the "Affirmer"), to the extent that he or she
is an owner of Copyright and Related Rights in the Work, voluntarily
elects to apply CC0 to the Work and publicly distribute the Work under its
terms, with knowledge of his or her Copyright and Related Rights in the
Work and the meaning and intended legal effect of CC0 on those rights.

1. Copyright and Related Rights. A Work made available under CC0 may be
protected by copyright and related or neighboring rights ("Copyright and
Related Rights"). Copyright and Related Rights include, but are not
limited to, the following:

  i. the right to reproduce, adapt, distribute, perform, display,
     communicate, and translate a Work;
 ii. moral rights retained by the original author(s) and/or performer(s);
iii. publicity and privacy rights pertaining to a person's image or
     likeness depicted in a Work;
 iv. rights protecting against unfair competition in regards to a Work,
     subject to the limitations in paragraph 4(a), below;
  v. rights protecting the extraction, dissemination, use and reuse of data
     in a Work;
 vi. database rights (such as those arising under Directive 96/9/EC of the
     European Parliament and of the Council of 11 March 1996 on the legal
     protection of databases, and under any national implementation
     thereof, including any amended or successor version of such
     directive); and
vii. other similar, equivalent or corresponding rights throughout the
     world based on applicable law or treaty, and any national
     implementations thereof.

2. Waiver. To the greatest extent permitted by, but not in contravention
of, applicable law, Affirmer hereby overtly, fully, permanently,
irrevocably and unconditionally waives, abandons, and surrenders all of
Affirmer's Copyright and Related Rights and associated claims and causes
of action, whether now known or unknown (including existing as well as
future claims and causes of action), in the Work (i) in all territories
worldwide, (ii) for the maximum duration provided by applicable law or
treaty (including future time extensions), (iii) in any current or future
medium and for any number of copies, and (iv) for any purpose whatsoever,
including without limitation commercial, advertising or promotional
purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each
member of the public at large and to the detriment of Affirmer's heirs and
successors, fully intending that such Waiver shall not be subject to
revocation, rescission, cancellation, termination, or any other legal or
equitable action to disrupt the quiet enjoyment of the Work by the public
as contemplated by Affirmer's express Statement of Purpose.

3. Public License Fallback. Should any part of the Waiver for any reason
be judged legally invalid or ineffective under applicable law, then the
Waiver shall be preserved to the maximum extent permitted taking into
account Affirmer's express Statement of Purpose. In addition, to the
extent the Waiver is so judged Affirmer hereby grants to each affected
person a royalty-free, non transferable, non sublicensable, non exclusive,
irrevocable and unconditional license to exercise Affirmer's Copyright and
Related Rights in the Work (i) in all territories worldwide, (ii) for the
maximum duration provided by applicable law or treaty (including future
time extensions), (iii) in any current or future medium and for any number
of copies, and (iv) for any purpose whatsoever, including without
limitation commercial, advertising or promotional purposes (the
"License"). The License shall be deemed effective as of the date CC0 was
applied by Affirmer to the Work. Should any part of the License for any
reason be judged legally invalid or ineffective under applicable law, such
partial invalidity or ineffectiveness shall not invalidate the remainder
of the License, and in such case Affirmer hereby affirms that he or she
will not (i) exercise any of his or her remaining Copyright and Related
Rights in the Work or (ii) assert any associated claims and causes of
action with respect to the Work, in either case contrary to Affirmer's
express Statement of Purpose.

4. Limitations and Disclaimers.

 a. No trademark or patent rights held by Affirmer are waived, abandoned,
    surrendered, licensed or otherwise affected by this document.
 b. Affirmer offers the Work as-is and makes no representations or
    warranties of any kind concerning the Work, express, implied,
    statutory or otherwise, including without limitation warranties of
    title, merchantability, fitness for a particular purpose, non
    infringement, or the absence of latent or other defects, accuracy, or
    the present or absence of errors, whether or not discoverable, all to
    the greatest extent permissible under applicable law.
 c. Affirmer disclaims responsibility for clearing rights of other persons
    that may apply to the Work or any use thereof, including without
    limitation any person's Copyright and Related Rights in the Work.
    Further, Affirmer disclaims responsibility for obtaining any necessary
    consents, permissions or other rights required for any use of the
    Work.
 d. Affirmer understands and acknowledges that Creative Commons is not a
    party to this document and has no duty or obligation with respect to
    this CC0 or use of the Work.

================================================
FILE: README.md
================================================
# nixpkgs-update

[![Patreon](https://img.shields.io/badge/patreon-donate-blue.svg)](https://www.patreon.com/nixpkgsupdate)

Please read the [documentation](https://nix-community.github.io/nixpkgs-update/).


================================================
FILE: app/Main.hs
================================================
{-# LANGUAGE ExtendedDefaultRules #-}
{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE OverloadedStrings #-}
{-# OPTIONS_GHC -fno-warn-type-defaults #-}

module Main where

import Control.Applicative ((<**>))
import qualified Data.Text as T
import qualified Data.Text.IO as T
import DeleteMerged (deleteDone)
import Git
import qualified GitHub as GH
import NVD (withVulnDB)
import qualified Nix
import qualified Options.Applicative as O
import OurPrelude
import qualified Repology
import System.IO (BufferMode (..), hSetBuffering, stderr, stdout)
import qualified System.Posix.Env as P
import Update (cveAll, cveReport, sourceGithubAll, updatePackage)
import Utils (Options (..), UpdateEnv (..), getGithubToken, getGithubUser)

default (T.Text)

data UpdateOptions = UpdateOptions
  { pr :: Bool,
    cve :: Bool,
    nixpkgsReview :: Bool,
    outpaths :: Bool,
    attrpathOpt :: Bool
  }

data Command
  = Update UpdateOptions Text
  | UpdateBatch UpdateOptions Text
  | DeleteDone Bool
  | Version
  | UpdateVulnDB
  | CheckAllVulnerable
  | SourceGithub
  | FetchRepology
  | CheckVulnerable Text Text Text

updateOptionsParser :: O.Parser UpdateOptions
updateOptionsParser =
  UpdateOptions
    <$> O.flag False True (O.long "pr" <> O.help "Make a pull request using Hub.")
    <*> O.flag False True (O.long "cve" <> O.help "Make a CVE vulnerability report.")
    <*> O.flag False True (O.long "nixpkgs-review" <> O.help "Runs nixpkgs-review on update commit rev")
    <*> O.flag False True (O.long "outpaths" <> O.help "Calculate outpaths to determine the branch to target")
    <*> O.flag False True (O.long "attrpath" <> O.help "UPDATE_INFO uses the exact attrpath.")

updateParser :: O.Parser Command
updateParser =
  Update
    <$> updateOptionsParser
    <*> O.strArgument (O.metavar "UPDATE_INFO" <> O.help "update string of the form: 'pkg oldVer newVer update-page'\n\n example: 'tflint 0.15.0 0.15.1 repology.org'")

updateBatchParser :: O.Parser Command
updateBatchParser =
  UpdateBatch
    <$> updateOptionsParser
    <*> O.strArgument (O.metavar "UPDATE_INFO" <> O.help "update string of the form: 'pkg oldVer newVer update-page'\n\n example: 'tflint 0.15.0 0.15.1 repology.org'")

deleteDoneParser :: O.Parser Command
deleteDoneParser =
  DeleteDone
    <$> O.flag False True (O.long "delete" <> O.help "Actually delete the done branches. Otherwise just prints the branches to delete.")

commandParser :: O.Parser Command
commandParser =
  O.hsubparser
    ( O.command
        "update"
        (O.info (updateParser) (O.progDesc "Update one package"))
        <> O.command
          "update-batch"
          (O.info (updateBatchParser) (O.progDesc "Update one package in batch mode."))
        <> O.command
          "delete-done"
          ( O.info
              deleteDoneParser
              (O.progDesc "Deletes branches from PRs that were merged or closed")
          )
        <> O.command
          "version"
          ( O.info
              (pure Version)
              ( O.progDesc
                  "Displays version information for nixpkgs-update and dependencies"
              )
          )
        <> O.command
          "update-vulnerability-db"
          ( O.info
              (pure UpdateVulnDB)
              (O.progDesc "Updates the vulnerability database")
          )
        <> O.command
          "check-vulnerable"
          (O.info checkVulnerable (O.progDesc "checks if something is vulnerable"))
        <> O.command
          "check-all-vulnerable"
          ( O.info
              (pure CheckAllVulnerable)
              (O.progDesc "checks all packages to update for vulnerabilities")
          )
        <> O.command
          "source-github"
          (O.info (pure SourceGithub) (O.progDesc "looks for updates on GitHub"))
        <> O.command
          "fetch-repology"
          (O.info (pure FetchRepology) (O.progDesc "fetches update from Repology and prints them to stdout"))
    )

checkVulnerable :: O.Parser Command
checkVulnerable =
  CheckVulnerable
    <$> O.strArgument (O.metavar "PRODUCT_ID")
    <*> O.strArgument (O.metavar "OLD_VERSION")
    <*> O.strArgument (O.metavar "NEW_VERSION")

programInfo :: O.ParserInfo Command
programInfo =
  O.info
    (commandParser <**> O.helper)
    ( O.fullDesc
        <> O.progDesc "Update packages in the Nixpkgs repository"
        <> O.header "nixpkgs-update"
    )

main :: IO ()
main = do
  hSetBuffering stdout LineBuffering
  hSetBuffering stderr LineBuffering
  command <- O.execParser programInfo
  ghUser <- getGithubUser
  token <- fromMaybe "" <$> getGithubToken
  P.setEnv "GITHUB_TOKEN" (T.unpack token) True
  P.setEnv "GITHUB_API_TOKEN" (T.unpack token) True
  P.setEnv "PAGER" "" True
  case command of
    DeleteDone delete -> do
      setupNixpkgs $ GH.untagName ghUser
      deleteDone delete token ghUser
    Update UpdateOptions {pr, cve, nixpkgsReview, outpaths, attrpathOpt} update -> do
      setupNixpkgs $ GH.untagName ghUser
      updatePackage (Options pr False ghUser token cve nixpkgsReview outpaths attrpathOpt) update
    UpdateBatch UpdateOptions {pr, cve, nixpkgsReview, outpaths, attrpathOpt} update -> do
      setupNixpkgs $ GH.untagName ghUser
      updatePackage (Options pr True ghUser token cve nixpkgsReview outpaths attrpathOpt) update
    Version -> do
      v <- runExceptT Nix.version
      case v of
        Left t -> T.putStrLn ("error:" <> t)
        Right t -> T.putStrLn t
    UpdateVulnDB -> withVulnDB $ \_conn -> pure ()
    CheckAllVulnerable -> do
      setupNixpkgs $ GH.untagName ghUser
      updates <- T.readFile "packages-to-update.txt"
      cveAll undefined updates
    CheckVulnerable productID oldVersion newVersion -> do
      setupNixpkgs $ GH.untagName ghUser
      report <-
        cveReport
          (UpdateEnv productID oldVersion newVersion Nothing (Options False False ghUser token False False False False))
      T.putStrLn report
    SourceGithub -> do
      updates <- T.readFile "packages-to-update.txt"
      setupNixpkgs $ GH.untagName ghUser
      sourceGithubAll (Options False False ghUser token False False False False) updates
    FetchRepology -> Repology.fetch


================================================
FILE: doc/batch-updates.md
================================================
# Batch updates {#batch-updates}

nixpkgs-update supports batch updates via the `update-list`
subcommand.

## Update-List tutorial

1. Set up [hub](https://github.com/github/hub) and give it your GitHub
   credentials, so it saves an OAuth token. This allows nixpkgs-update
   to query the GitHub API.  Alternatively, if you prefer not to install
   and configure `hub`, you can manually create a GitHub token with
   `repo` and `gist` scopes.  Provide it to `nixpkgs-update` by
   exporting it as the `GITHUB_TOKEN` environment variable
   (`nixpkgs-update` reads credentials from the files `hub` uses but
   no longer uses `hub` itself).

2. Clone this repository and build `nixpkgs-update`:
    ```bash
    git clone https://github.com/nix-community/nixpkgs-update && cd nixpkgs-update
    nix-build
    ```

3. To test your config, try to update a single package, like this:

   ```bash
   ./result/bin/nixpkgs-update update "pkg oldVer newVer update-page"

   # Example:
   ./result/bin/nixpkgs-update update "tflint 0.15.0 0.15.1 repology.org"
   ```

   replacing `tflint` with the attribute name of the package you actually want
   to update, and the old version and new version accordingly.

   If this works, you are now set up to hack on `nixpkgs-update`! If
   you run it with `--pr`, it will actually send a pull request, which
   looks like this: https://github.com/NixOS/nixpkgs/pull/82465


4. If you'd like to send a batch of updates, get a list of outdated packages and
   place them in a `packages-to-update.txt` file:

   ```bash
   ./result/bin/nixpkgs-update fetch-repology > packages-to-update.txt
   ```

   There are also alternative sources of updates, including:

   - package updateScript:
     [passthru.updateScript](https://nixos.org/manual/nixpkgs/unstable/#var-passthru-updateScript)
   - GitHub releases:
     [nixpkgs-update-github-releases](https://github.com/synthetica9/nixpkgs-update-github-releases)

5. Run the tool in batch mode with `update-list`:

   ```bash
   ./result/bin/nixpkgs-update update-list
   ```
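
Each line of `packages-to-update.txt` is an UPDATE_INFO string in the same whitespace-separated form used by the `update` subcommand. A minimal sketch of splitting one such line into its fields (the variable names here are illustrative, not taken from the codebase):

```bash
# One line of the form 'pkg oldVer newVer update-page'.
line="tflint 0.15.0 0.15.1 repology.org"

# Split on whitespace into its four fields.
read -r attrpath old_version new_version update_page <<< "$line"

echo "$attrpath: $old_version -> $new_version (source: $update_page)"
```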


================================================
FILE: doc/contact.md
================================================
# Contact {#contact}

Github: [https://github.com/nix-community/nixpkgs-update](https://github.com/nix-community/nixpkgs-update)

Matrix: [https://matrix.to/#/#nixpkgs-update:nixos.org](https://matrix.to/#/#nixpkgs-update:nixos.org)


================================================
FILE: doc/contributing.md
================================================
# Contributing {#contributing}

Incremental development:

```bash
nix-shell --run "cabal v2-repl"
```

Run the tests:

```bash
nix-shell --run "cabal v2-test"
```

Run a type checker in the background for quicker type checking feedback:

```bash
nix-shell --run "ghcid"
```

Run a type checker for the app code:

```bash
nix-shell --run 'ghcid -c "cabal v2-repl exe:nixpkgs-update"'
```

Run a type checker for the test code:

```bash
nix-shell --run 'ghcid -c "cabal v2-repl tests"'
```

Updating the Cabal file when adding new dependencies or options:

```bash
nix run nixpkgs#haskellPackages.hpack
```

Source files are formatted with [Ormolu](https://github.com/tweag/ormolu).

There is also a [Cachix cache](https://nix-community.cachix.org/) available for the dependencies of this program.


================================================
FILE: doc/details.md
================================================
# Details {#details}

Some of these features apply only to the update-list subcommand or
only to the @r-ryantm bot.

## Checks

A number of checks are performed to help nixpkgs maintainers gauge the
likelihood that an update was successful. All the binaries are run with
various flags to see if they have a zero exit code and output the new
version number. The outpath directory tree is searched for files
containing the new version number. A directory tree and disk usage
listing is provided.
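
The binary check can be sketched roughly as follows, using a throwaway script as a stand-in for an updated package's binary (the flag choice and variable names are assumptions; the real logic lives in Check.hs):

```bash
new_version="2.0"

# Stand-in binary that prints its version, like an updated package's
# executable would; the real check runs binaries from the outpath.
bin="$(mktemp)"
printf '#!/bin/sh\necho "example-tool 2.0"\n' > "$bin"
chmod +x "$bin"

# The check itself: a zero exit code and the new version in the output.
if output="$("$bin" --version)" && printf '%s\n' "$output" | grep -q "$new_version"; then
  check_result="passed"
else
  check_result="failed"
fi
echo "check: $check_result"
rm -f "$bin"
```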


## Security report

Information from the National Vulnerability Database maintained by
NIST is compared against the current and updated package version. The
nixpkgs package name is matched with the Common Platform Enumeration
vendor, product, edition, software edition, and target software fields
to find candidate Common Vulnerabilities and Exposures (CVEs). The
CVEs are then filtered by matching the current and updated versions
against the CVE version ranges.

The general philosophy of the CVE search is to avoid false negatives,
which means we expect to generate many false positives. The false
positives can be carefully removed by manually created rules
implemented in the filter function in the NVDRules module.
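
The version-range filtering can be illustrated with a small shell sketch using version-aware sorting; the range bounds below are invented, and the real implementation is the Haskell code in NVD.hs and NVDRules.hs:

```bash
# True if $1 <= $2 in version order (illustrative helper using sort -V).
version_le() {
  [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}

# An invented CVE matcher with affected half-open range [start, end).
cve_start="0.12.0"
cve_end="0.12.9"
affected() { version_le "$cve_start" "$1" && ! version_le "$cve_end" "$1"; }

affected "0.12.7" && current_hit="yes" || current_hit="no"
affected "0.12.9" && updated_hit="yes" || updated_hit="no"

# Affected before but not after: the CVE is resolved by this update.
echo "current: $current_hit, updated: $updated_hit"
```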

If there are no CVE matches, the report is not shown. The report has
three parts: CVEs resolved by this update, CVEs introduced by this
update, and CVEs present in both versions.

If you would like to report a problem with the security report, please
use the [nixpkgs-update GitHub
issues](https://github.com/nix-community/nixpkgs-update/issues).

The initial development of the security report was made possible by a
partnership with [Serokell](https://serokell.io/) and the [NLNet
Foundation](https://nlnet.nl/) through their [Next Generation Internet
Zero Discovery initiative](https://nlnet.nl/discovery/) (NGI0
Discovery). NGI0 Discovery is made possible with financial support
from the [European Commission](https://ec.europa.eu/).


## Rebuild report

The PRs made by nixpkgs-update say what packages need to be rebuilt if
the pull request is merged. This uses the same mechanism
[OfBorg](https://github.com/NixOS/ofborg) uses to put rebuild labels
on PRs. Unlike the labels, it can report the exact number of
rebuilds and list some of the attrpaths that would need to be rebuilt.
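
The idea can be sketched by comparing attrpath-to-outpath listings from before and after the change (toy data below; the real calculation lives in Outpaths.hs):

```bash
workdir="$(mktemp -d)"

# Toy outpath listings: one 'attrpath storePath' pair per line, sorted.
cat > "$workdir/before.txt" <<'EOF'
app-using-tflint /nix/store/ccc-app-1.2
libfoo /nix/store/bbb-libfoo-1.0
tflint /nix/store/aaa-tflint-0.15.0
EOF
cat > "$workdir/after.txt" <<'EOF'
app-using-tflint /nix/store/eee-app-1.2
libfoo /nix/store/bbb-libfoo-1.0
tflint /nix/store/ddd-tflint-0.15.1
EOF

# Lines only in the new listing are attrpaths whose outpath changed,
# i.e. packages that would be rebuilt if the PR were merged.
rebuilds="$(comm -13 "$workdir/before.txt" "$workdir/after.txt" | wc -l)"
echo "rebuilds: $rebuilds"
```

With a count like this in hand, choosing a target branch is just a threshold check on the number of rebuilds.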


## PRs against staging

If a PR merge would cause more than 500 packages to be rebuilt, the PR
is made against staging.


## Logs

[Logs from r-ryantm's runs](https://nixpkgs-update-logs.nix-community.org/) are
available online. There are a lot of packages `nixpkgs-update`
currently has no hope of updating. Please dredge the logs to find out
why your pet package is not receiving updates.


## Cache

By serving the build outputs from
[https://nixpkgs-update-cache.nix-community.org/](https://nixpkgs-update-cache.nix-community.org/), nixpkgs-update allows you to
test a package with one command.


================================================
FILE: doc/donate.md
================================================
# Donate {#donate}

[@r-ryantm](https://github.com/r-ryantm), the bot that updates Nixpkgs, is currently running on a Hetzner bare-metal server that costs me €60 per month. Your support in paying for infrastructure would be a great help:

* [GitHub Sponsors](https://github.com/sponsors/ryantm)
* [Patreon](https://www.patreon.com/nixpkgsupdate)


================================================
FILE: doc/installation.md
================================================
# Installation {#installation}

::: note
For the Cachix cache to work, your user must be in the trusted-users
list, or you can use sudo, since root is effectively trusted.
:::

Run without installing on stable Nix:

```ShellSession
$ nix run \
  --option extra-substituters 'https://nix-community.cachix.org/' \
  --option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \
  -f https://github.com/nix-community/nixpkgs-update/archive/main.tar.gz \
  -c nixpkgs-update --help
```

Run without installing on unstable Nix with nix command enabled:

```ShellSession
$ nix shell \
  --option extra-substituters 'https://nix-community.cachix.org/' \
  --option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \
  -f https://github.com/nix-community/nixpkgs-update/archive/main.tar.gz \
  -c nixpkgs-update --help
```

Run without installing on unstable Nix with nix flakes enabled:

```ShellSession
$ nix run \
  --option extra-substituters 'https://nix-community.cachix.org/' \
  --option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \
  github:nix-community/nixpkgs-update -- --help
```

Install into your Nix profile:

```ShellSession
$ nix-env \
  --option extra-substituters 'https://nix-community.cachix.org/' \
  --option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \
  -if https://github.com/nix-community/nixpkgs-update/archive/main.tar.gz
```

Declaratively with [niv](https://github.com/nmattia/niv):

```ShellSession
$ niv add nix-community/nixpkgs-update
```

NixOS config with Niv:

```nix
let
  sources = import ./nix/sources.nix;
  nixpkgs-update = import sources.nixpkgs-update {};
in
  environment.systemPackages = [ nixpkgs-update ];
```

home-manager config with Niv:

```nix
let
  sources = import ./nix/sources.nix;
  nixpkgs-update = import sources.nixpkgs-update {};
in
  home.packages = [ nixpkgs-update ];
```


================================================
FILE: doc/interactive-updates.md
================================================
# Interactive updates {#interactive-updates}

nixpkgs-update supports interactive, single package updates via the
`update` subcommand.

# Update tutorial

1. Set up [`hub`](https://github.com/github/hub) and give it your
   GitHub credentials.  Alternatively, if you prefer not to install
   and configure `hub`, you can manually create a GitHub token with
   `repo` and `gist` scopes.  Provide it to `nixpkgs-update` by
   exporting it as the `GITHUB_TOKEN` environment variable
   (`nixpkgs-update` reads credentials from the files `hub` uses but
   no longer uses `hub` itself).
2. Go to your local checkout of nixpkgs, and **make sure the working
   directory is clean**. Be on a branch you are okay committing to.
3. Ensure that there is a Git remote called `upstream` that points to nixpkgs:
   ```sh
   git remote add upstream "https://github.com/NixOS/nixpkgs.git"
   ```
4. Run it like: `nixpkgs-update update "postman 7.20.0 7.21.2"`,
   which means: update the package "postman" from version 7.20.0
   to version 7.21.2.
5. It will run the updater, and, if the update builds, it will commit
   the update and output a message you could use for a pull request.
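
If you created a token manually in step 1, the export looks like this (the value below is a placeholder, not a real token):

```bash
# Placeholder token; substitute the token you generated with repo and
# gist scopes. nixpkgs-update reads it from the GITHUB_TOKEN variable.
export GITHUB_TOKEN="ghp_replace_with_your_token"

# Fail early with a clear message if the variable is unset or empty.
: "${GITHUB_TOKEN:?set GITHUB_TOKEN before running nixpkgs-update}"
```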

# Flags

`--cve`

: adds CVE vulnerability reporting to the PR message. On
  first invocation with this option, a CVE database is
  built. Subsequent invocations will be much faster.

`--nixpkgs-review`

: runs [nixpkgs-review](https://github.com/Mic92/nixpkgs-review),
  which tries to build all the packages that depend on the one being
  updated and adds a report.


================================================
FILE: doc/introduction.md
================================================
# nixpkgs-update {#introduction}

> The future is here; let's evenly distribute it!

The [nixpkgs-update](https://github.com/nix-community/nixpkgs-update) mission
is to make [nixpkgs](https://github.com/nixos/nixpkgs) the most
up-to-date repository of software in the world by the most ridiculous
margin possible. [Here's how we are doing so far](https://repology.org/repositories/graphs).

It provides an interactive tool for automating single package
updates. Given a package name, old version, and new version, it
updates the version and fetcher hashes, makes a commit, and
optionally opens a pull request. Along the way, it runs checks to
make sure the update meets a baseline quality.

It is the code used by the GitHub bot
[@r-ryantm](https://github.com/r-ryantm) to automatically update
nixpkgs. It uses package repository information from
[Repology.org](https://repology.org/repository/nix_unstable), the
GitHub releases API, and the package `passthru.updateScript` to
generate lists of outdated packages.


================================================
FILE: doc/nixpkgs-maintainer-faq.md
================================================
# Nixpkgs Maintainer FAQ {#nixpkgs-maintainer-faq}

## @r-ryantm opened a PR for my package, what do I do?

Thanks for being a maintainer. Hopefully, @r-ryantm will be able to save you some time!

1. Review the PR diff, making sure this update makes sense
   - sometimes updates go backward or accidentally use a dev version
2. Review upstream changelogs and commits
3. Follow the "Instructions to test this update" section of the PR to get the built program on your computer quickly
4. Make a GitHub Review approving or requesting changes. Include screenshots or other notes as appropriate.

## Why is @r-ryantm not updating my package? {#no-update}

There are lots of reasons a package might not be updated. You can usually figure out which one is the issue by looking at the [logs](https://nixpkgs-update-logs.nix-community.org/) or by asking the [maintainers](#contact).

### No new version information

r-ryantm gets its new version information from three sources:

* Repology - information from Repology is delayed because it only updates when there is an unstable channel release
* GitHub releases
* package passthru.updateScript

If none of these sources says the package is out of date, it will not attempt to update it.

### Disabling package updates

Updates can be disabled by adding a comment to the package:
```
# nixpkgs-update: no auto update
```
[Example in nixpkgs](https://github.com/NixOS/nixpkgs/blob/f2294037ad2b1345c5d9c2df0e81bdb00eab21f3/pkgs/applications/version-management/gitlab/gitlab-pages/default.nix#L7)

### Skiplist

We maintain a [Skiplist](https://github.com/nix-community/nixpkgs-update/blob/main/src/Skiplist.hs) of different things not to update. It is possible your package is triggering one of the skip criteria.

Python updates are skipped if they cause more than 100 rebuilds.

### Existing Open or Draft PR

If there is an existing PR with the exact title of `$attrPath: $oldVersion -> $newVersion`, it will not update the package.
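For illustration, the duplicate-PR check matches against a title built from exactly those three pieces. A sketch of that title format (`pr_title` is a hypothetical helper; the real check lives in the project's Haskell code):

```rust
/// Build the PR title nixpkgs-update searches for when deciding
/// whether an update PR already exists.
fn pr_title(attr_path: &str, old_version: &str, new_version: &str) -> String {
    format!("{}: {} -> {}", attr_path, old_version, new_version)
}
```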

### Version not newer

If Nix's `builtins.compareVersions` does not think the "new" version is newer, it will not update.
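A simplified sketch of that comparison, with version components compared numerically left to right. This is an assumption-laden approximation: Nix's real `builtins.compareVersions` also splits at digit/non-digit boundaries and orders alphabetic components (with `pre` sorting before everything), which this sketch omits.

```rust
use std::cmp::Ordering;

/// Simplified version comparison: split on '.' and '-', compare the
/// leading numeric part of each component. (Real `builtins.compareVersions`
/// handles alphabetic components and "pre" specially.)
fn compare_versions(a: &str, b: &str) -> Ordering {
    let parse = |v: &str| -> Vec<u64> {
        v.split(|c: char| c == '.' || c == '-')
            .map(|part| {
                part.chars()
                    .take_while(|c| c.is_ascii_digit())
                    .collect::<String>()
                    .parse()
                    .unwrap_or(0)
            })
            .collect()
    };
    // Vec comparison is lexicographic, so components compare left to right.
    parse(a).cmp(&parse(b))
}
```

Under this scheme `1.10` correctly sorts after `1.9`, which a plain string comparison would get wrong.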

### Incompatible with "Path Pin"

Some attrpaths have versions appended to the end of them, like `ruby_3_0`; the new version has to be compatible with this "path pin". Here are some examples:

```Haskell
-- >>> versionCompatibleWithPathPin "libgit2_0_25" "0.25.3"
-- True
--
-- >>> versionCompatibleWithPathPin "owncloud90" "9.0.3"
-- True
--
-- >>> versionCompatibleWithPathPin "owncloud-client" "2.4.1"
-- True
--
-- >>> versionCompatibleWithPathPin "owncloud90" "9.1.3"
-- False
--
-- >>> versionCompatibleWithPathPin "nodejs-slim-10_x" "11.2.0"
-- False
--
-- >>> versionCompatibleWithPathPin "nodejs-slim-10_x" "10.12.0"
-- True
```
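A hypothetical Rust port that reproduces the doctest behaviour above; the authoritative implementation is the Haskell `versionCompatibleWithPathPin`, and this sketch only infers its rules from the examples:

```rust
/// Sketch of the path-pin compatibility check, reconstructed from the
/// doctest examples above.
fn version_compatible_with_path_pin(attr_path: &str, new_version: &str) -> bool {
    // A trailing "_x" is a wildcard: "nodejs-slim-10_x" pins only the "10".
    if attr_path.len() >= 2 && attr_path[attr_path.len() - 2..].eq_ignore_ascii_case("_x") {
        return version_compatible_with_path_pin(&attr_path[..attr_path.len() - 2], new_version);
    }
    // "libgit2_0_25" pins "0.25": everything after the first underscore,
    // with underscores read as dots, must prefix the new version.
    if let Some((_, pin)) = attr_path.split_once('_') {
        return new_version.starts_with(&pin.replace('_', "."));
    }
    // No underscore: compare the trailing digit run against the new version
    // with its dots dropped, e.g. "owncloud90" vs "9.0.3" -> "903".
    let digits_rev: String = attr_path
        .chars()
        .rev()
        .take_while(|c| c.is_ascii_digit())
        .collect();
    let pin: String = digits_rev.chars().rev().collect();
    if pin.is_empty() {
        return true; // the attr path carries no version pin at all
    }
    let version_digits: String = new_version.chars().filter(|c| *c != '.').collect();
    version_digits.starts_with(pin.as_str())
}
```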

### Can't find derivation file

If `nix edit $attrpath` does not open the correct file that contains the version string and fetcher hash, the update will fail.

This might not work, for example, if a package doesn't have a `meta` attr at all, or if the package uses a builder function that discards the `meta` attr.

### Update already merged

If the update is already on `master`, `staging`, or `staging-next`, the update will fail.

### Can't find hash or source url

If the derivation file has no hash or source URL, it will fail.

Since `nixpkgs-update` is trying to read these from `<pkg>.src`, this can also happen if the package's source is something unexpected such as another package. You can set the fallback `originalSrc` attr so that `nixpkgs-update` can find the correct source in cases like this.

### No updateScript and no version

If the derivation file has no version and no updateScript, it will fail.

### No changes

If the derivation "Rewriters" fail to change the derivation, it will fail.

If there is no updateScript, and the source url or the hash did not change, it will fail.

### No rebuilds

If the rewrites didn't cause any derivations to change, it will fail.

### Didn't build

If the package doesn't build after the rewrites, the update will fail.


================================================
FILE: doc/nixpkgs-update.md
================================================


================================================
FILE: doc/nu.md
================================================
# nu

## Schema

## Table `package`

- id : int
- attrPath : string
- versionNixpkgsMaster : string
- versionNixpkgsStaging : string
- versionNixpkgsStagingNext : string
- versionRepology : string
- versionGitHub : string
- versionGitLab : string
- versionPypi : string
- projectRepology : string
- nixpkgsNameReplogy : string
- ownerGitHub : string
- repoGitHub : string
- ownerGitLab : string
- repoGitLab : string
- lastCheckedRepology : timestamp
- lastCheckedGitHub : timestamp
- lastCheckedGitLab : timestamp
- lastCheckedPyPi : timestamp
- lastCheckedPendingPR : timestamp
- lastUpdateAttempt : timestamp
- pendingPR : int
- pendingPROwner : string
- pendingPRBranchName : string
- lastUpdateLog : string

## Table `maintainer-package`

- id : int
- packageId : int
- maintainerId : int

## Table `maintainer`

- id : int
- gitHubName : string


================================================
FILE: doc/r-ryantm.md
================================================
# r-ryantm bot {#r-ryantm}

[@r-ryantm](https://github.com/r-ryantm) is a bot account that updates Nixpkgs by making PRs that bump a package to the latest version. It runs on [community-configured infrastructure](https://nix-community.org/update-bot/).


================================================
FILE: doc/toc.md
================================================
# nixpkgs-update

* [Introduction](#introduction)
* [Installation](#installation)
* [Interactive updates](#interactive-updates)
* [Batch updates](#batch-updates)
* [r-ryantm bot](#r-ryantm)
* [Details](#details)
* [Contributing](#contributing)
* [Donate](#donate)
* [Nixpkgs Maintainer FAQ](#nixpkgs-maintainer-faq)
* [Contact](#contact)


================================================
FILE: flake.nix
================================================
{
  description = "update nixpkgs automatically";

  inputs.mmdoc.url = "github:ryantm/mmdoc";
  inputs.mmdoc.inputs.nixpkgs.follows = "nixpkgs";

  inputs.treefmt-nix.url = "github:numtide/treefmt-nix";
  inputs.treefmt-nix.inputs.nixpkgs.follows = "nixpkgs";

  inputs.runtimeDeps.url = "github:NixOS/nixpkgs/nixos-unstable-small";

  nixConfig.extra-substituters = "https://nix-community.cachix.org";
  nixConfig.extra-trusted-public-keys = "nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=";

  outputs = { self, nixpkgs, mmdoc, treefmt-nix, runtimeDeps } @ args:
    let
      systems = [ "x86_64-linux" "aarch64-linux" "x86_64-darwin" "aarch64-darwin" ];
      eachSystem = f: nixpkgs.lib.genAttrs systems (system: f nixpkgs.legacyPackages.${system});
      treefmtEval = eachSystem (pkgs: treefmt-nix.lib.evalModule pkgs {
        projectRootFile = ".git/config";
        programs.ormolu.enable = true;
      });
    in
    {
      checks.x86_64-linux =
        let
          packages = nixpkgs.lib.mapAttrs' (n: nixpkgs.lib.nameValuePair "package-${n}") self.packages.x86_64-linux;
          devShells = nixpkgs.lib.mapAttrs' (n: nixpkgs.lib.nameValuePair "devShell-${n}") self.devShells.x86_64-linux;
        in
        packages // devShells // {
          treefmt = treefmtEval.x86_64-linux.config.build.check self;
        };

      formatter = eachSystem (pkgs: treefmtEval.${pkgs.system}.config.build.wrapper);

      packages.x86_64-linux = import ./pkgs/default.nix (args // { system = "x86_64-linux"; });
      devShells.x86_64-linux.default = self.packages."x86_64-linux".devShell;

      # nix flake check is broken for these when run on x86_64-linux
      # packages.x86_64-darwin = import ./pkgs/default.nix (args // { system = "x86_64-darwin"; });
      # devShells.x86_64-darwin.default = self.packages."x86_64-darwin".devShell;
    };
}


================================================
FILE: nixpkgs-update.cabal
================================================
cabal-version: 2.2

-- This file has been generated from package.yaml by hpack version 0.38.2.
--
-- see: https://github.com/sol/hpack
--
-- hash: 216d241b554fc46ae5c7bcb1605ea89df7993f629b942a8f243338d52cb49301

name:           nixpkgs-update
version:        0.4.0
synopsis:       Tool for semi-automatic updating of nixpkgs repository
description:    nixpkgs-update provides tools for updating of nixpkgs packages in a semi-automatic way. Mainly, it is used to run the GitHub bot @r-ryantm, but the underlying update mechanisms should be generally useful and in a later version should be exposed as a command-line tool.
category:       Web
homepage:       https://github.com/nix-community/nixpkgs-update#readme
bug-reports:    https://github.com/nix-community/nixpkgs-update/issues
author:         Ryan Mulligan et al.
maintainer:     ryan@ryantm.com
copyright:      2018-2022 Ryan Mulligan et al.
license:        CC0-1.0
license-file:   LICENSE
build-type:     Simple
extra-source-files:
    README.md

source-repository head
  type: git
  location: https://github.com/nix-community/nixpkgs-update

library
  exposed-modules:
      Check
      CVE
      Data.Hex
      DeleteMerged
      File
      GH
      Git
      Nix
      NixpkgsReview
      NVD
      NVDRules
      OurPrelude
      Outpaths
      Process
      Repology
      Rewrite
      Skiplist
      Update
      Utils
      Version
  hs-source-dirs:
      src
  default-extensions:
      DataKinds
      FlexibleContexts
      GADTs
      LambdaCase
      PolyKinds
      RankNTypes
      ScopedTypeVariables
      TypeApplications
      TypeFamilies
      TypeOperators
      BlockArguments
  ghc-options: -Wall -O2 -flate-specialise -fspecialise-aggressively -fplugin=Polysemy.Plugin
  build-depends:
      aeson
    , base >=4.13 && <5
    , bytestring
    , conduit
    , containers
    , cryptohash-sha256
    , directory
    , errors
    , filepath
    , github
    , http-client
    , http-client-tls
    , http-conduit
    , http-types
    , iso8601-time
    , lifted-base
    , mtl
    , neat-interpolation
    , optparse-applicative
    , parsec
    , parsers
    , partial-order
    , polysemy
    , polysemy-plugin
    , regex-applicative-text
    , servant
    , servant-client
    , sqlite-simple
    , template-haskell
    , temporary
    , text
    , th-env
    , time
    , transformers
    , typed-process
    , unix
    , unordered-containers
    , vector
    , versions
    , xdg-basedir
    , zlib
  default-language: Haskell2010

executable nixpkgs-update
  main-is: Main.hs
  hs-source-dirs:
      app
  default-extensions:
      DataKinds
      FlexibleContexts
      GADTs
      LambdaCase
      PolyKinds
      RankNTypes
      ScopedTypeVariables
      TypeApplications
      TypeFamilies
      TypeOperators
      BlockArguments
  ghc-options: -Wall -O2 -flate-specialise -fspecialise-aggressively -fplugin=Polysemy.Plugin
  build-depends:
      aeson
    , base >=4.13 && <5
    , bytestring
    , conduit
    , containers
    , cryptohash-sha256
    , directory
    , errors
    , filepath
    , github
    , http-client
    , http-client-tls
    , http-conduit
    , http-types
    , iso8601-time
    , lifted-base
    , mtl
    , neat-interpolation
    , nixpkgs-update
    , optparse-applicative
    , parsec
    , parsers
    , partial-order
    , polysemy
    , polysemy-plugin
    , regex-applicative-text
    , servant
    , servant-client
    , sqlite-simple
    , template-haskell
    , temporary
    , text
    , th-env
    , time
    , transformers
    , typed-process
    , unix
    , unordered-containers
    , vector
    , versions
    , xdg-basedir
    , zlib
  default-language: Haskell2010

test-suite spec
  type: exitcode-stdio-1.0
  main-is: Spec.hs
  other-modules:
      CheckSpec
      DoctestSpec
      UpdateSpec
      UtilsSpec
  hs-source-dirs:
      test
  default-extensions:
      DataKinds
      FlexibleContexts
      GADTs
      LambdaCase
      PolyKinds
      RankNTypes
      ScopedTypeVariables
      TypeApplications
      TypeFamilies
      TypeOperators
      BlockArguments
  ghc-options: -Wall -O2 -flate-specialise -fspecialise-aggressively -fplugin=Polysemy.Plugin
  build-depends:
      aeson
    , base >=4.13 && <5
    , bytestring
    , conduit
    , containers
    , cryptohash-sha256
    , directory
    , doctest
    , errors
    , filepath
    , github
    , hspec
    , hspec-discover
    , http-client
    , http-client-tls
    , http-conduit
    , http-types
    , iso8601-time
    , lifted-base
    , mtl
    , neat-interpolation
    , nixpkgs-update
    , optparse-applicative
    , parsec
    , parsers
    , partial-order
    , polysemy
    , polysemy-plugin
    , regex-applicative-text
    , servant
    , servant-client
    , sqlite-simple
    , template-haskell
    , temporary
    , text
    , th-env
    , time
    , transformers
    , typed-process
    , unix
    , unordered-containers
    , vector
    , versions
    , xdg-basedir
    , zlib
  default-language: Haskell2010


================================================
FILE: nixpkgs-update.nix
================================================
{ mkDerivation, aeson, base, bytestring, conduit, containers
, cryptohash-sha256, directory, doctest, errors, filepath, github
, hspec, hspec-discover, http-client, http-client-tls, http-conduit
, http-types, iso8601-time, lib, lifted-base, mtl
, neat-interpolation, optparse-applicative, parsec, parsers
, partial-order, polysemy, polysemy-plugin, regex-applicative-text
, servant, servant-client, sqlite-simple, template-haskell
, temporary, text, th-env, time, transformers, typed-process, unix
, unordered-containers, vector, versions, xdg-basedir, zlib
}:
mkDerivation {
  pname = "nixpkgs-update";
  version = "0.4.0";
  src = ./.;
  isLibrary = true;
  isExecutable = true;
  libraryHaskellDepends = [
    aeson base bytestring conduit containers cryptohash-sha256
    directory errors filepath github http-client http-client-tls
    http-conduit http-types iso8601-time lifted-base mtl
    neat-interpolation optparse-applicative parsec parsers
    partial-order polysemy polysemy-plugin regex-applicative-text
    servant servant-client sqlite-simple template-haskell temporary
    text th-env time transformers typed-process unix
    unordered-containers vector versions xdg-basedir zlib
  ];
  executableHaskellDepends = [
    aeson base bytestring conduit containers cryptohash-sha256
    directory errors filepath github http-client http-client-tls
    http-conduit http-types iso8601-time lifted-base mtl
    neat-interpolation optparse-applicative parsec parsers
    partial-order polysemy polysemy-plugin regex-applicative-text
    servant servant-client sqlite-simple template-haskell temporary
    text th-env time transformers typed-process unix
    unordered-containers vector versions xdg-basedir zlib
  ];
  testHaskellDepends = [
    aeson base bytestring conduit containers cryptohash-sha256
    directory doctest errors filepath github hspec hspec-discover
    http-client http-client-tls http-conduit http-types iso8601-time
    lifted-base mtl neat-interpolation optparse-applicative parsec
    parsers partial-order polysemy polysemy-plugin
    regex-applicative-text servant servant-client sqlite-simple
    template-haskell temporary text th-env time transformers
    typed-process unix unordered-containers vector versions xdg-basedir
    zlib
  ];
  testToolDepends = [ hspec-discover ];
  homepage = "https://github.com/nix-community/nixpkgs-update#readme";
  description = "Tool for semi-automatic updating of nixpkgs repository";
  license = lib.licenses.cc0;
  mainProgram = "nixpkgs-update";
}


================================================
FILE: package.yaml
================================================
name: nixpkgs-update
version: 0.4.0
synopsis: Tool for semi-automatic updating of nixpkgs repository
description: nixpkgs-update provides tools for updating of nixpkgs
  packages in a semi-automatic way. Mainly, it is used to run the GitHub
  bot @r-ryantm, but the underlying update mechanisms should be
  generally useful and in a later version should be exposed as a
  command-line tool.
license: CC0-1.0
author: Ryan Mulligan et al.
maintainer: ryan@ryantm.com
copyright: 2018-2022 Ryan Mulligan et al.
category: Web
extra-source-files:
- README.md

github: nix-community/nixpkgs-update

ghc-options: -Wall -O2 -flate-specialise -fspecialise-aggressively -fplugin=Polysemy.Plugin

default-extensions:
  - DataKinds
  - FlexibleContexts
  - GADTs
  - LambdaCase
  - PolyKinds
  - RankNTypes
  - ScopedTypeVariables
  - TypeApplications
  - TypeFamilies
  - TypeOperators
  - BlockArguments

dependencies:
  - aeson
  - base >= 4.13 && < 5
  - bytestring
  - conduit
  - containers
  - cryptohash-sha256
  - directory
  - errors
  - filepath
  - github
  - http-client
  - http-client-tls
  - http-conduit
  - http-types
  - iso8601-time
  - lifted-base
  - mtl
  - neat-interpolation
  - optparse-applicative
  - parsec
  - parsers
  - partial-order
  - polysemy
  - polysemy-plugin
  - regex-applicative-text
  - servant
  - servant-client
  - sqlite-simple
  - template-haskell
  - temporary
  - text
  - th-env
  - time
  - transformers
  - typed-process
  - unix
  - unordered-containers
  - vector
  - versions
  - xdg-basedir
  - zlib

library:
  source-dirs: src
  when:
    - condition: false
      other-modules: Paths_nixpkgs_update

tests:
  spec:
    main: Spec.hs
    source-dirs:
      - test
    dependencies:
      - hspec
      - hspec-discover
      - nixpkgs-update
      - doctest
    when:
      - condition: false
        other-modules: Paths_nixpkgs_update

executables:
  nixpkgs-update:
    source-dirs: app
    main: Main.hs
    dependencies:
      - nixpkgs-update
    when:
      - condition: false
        other-modules: Paths_nixpkgs_update


================================================
FILE: pkgs/default.nix
================================================
{ nixpkgs
, mmdoc
, runtimeDeps
, system
, self
, ...
}:

let

  runtimePkgs = import runtimeDeps { inherit system; };

  pkgs = import nixpkgs { inherit system; config = { allowBroken = true; }; };

  deps = with runtimePkgs; {
    NIX = nix;
    GIT = git;
    TREE = tree;
    GIST = gist;
    # TODO: are there more coreutils paths that need locking down?
    TIMEOUT = coreutils;
    NIXPKGSREVIEW = nixpkgs-review;
  };

  drvAttrs = attrs: deps;

  haskellPackages = pkgs.haskellPackages.override {
    overrides = _: haskellPackages: {
      polysemy-plugin = pkgs.haskell.lib.dontCheck haskellPackages.polysemy-plugin;
      polysemy = pkgs.haskell.lib.dontCheck haskellPackages.polysemy;
      http-api-data = pkgs.haskell.lib.doJailbreak haskellPackages.http-api-data;
      nixpkgs-update =
        pkgs.haskell.lib.justStaticExecutables (
          pkgs.haskell.lib.failOnAllWarnings (
            pkgs.haskell.lib.disableExecutableProfiling (
              pkgs.haskell.lib.disableLibraryProfiling (
                pkgs.haskell.lib.generateOptparseApplicativeCompletion "nixpkgs-update" (
                  (haskellPackages.callPackage ../nixpkgs-update.nix { }).overrideAttrs drvAttrs
                )
              )
            )
          )
        );
    };
  };

  shell = haskellPackages.shellFor {
    nativeBuildInputs = with pkgs; [
      cabal-install
      ghcid
      haskellPackages.cabal2nix
    ];
    packages = ps: [ ps.nixpkgs-update ];
    shellHook = ''
      export ${
        nixpkgs.lib.concatStringsSep " " (
          nixpkgs.lib.mapAttrsToList (name: value: ''${name}="${value}"'') deps
        )
      }
    '';
  };

  doc = pkgs.stdenvNoCC.mkDerivation rec {
    name = "nixpkgs-update-doc";
    src = self;
    phases = [ "mmdocPhase" ];
    mmdocPhase = "${mmdoc.packages.${system}.mmdoc}/bin/mmdoc nixpkgs-update $src/doc $out";
  };

in
{
  nixpkgs-update = haskellPackages.nixpkgs-update;
  default = haskellPackages.nixpkgs-update;
  nixpkgs-update-doc = doc;
  devShell = shell;
}


================================================
FILE: rust/.envrc
================================================
use flake
dotenv

================================================
FILE: rust/.gitignore
================================================
target
db.sqlite
.direnv

================================================
FILE: rust/Cargo.toml
================================================
[package]
name = "nixpkgs-update"
version = "0.1.0"
edition = "2021"

[dependencies]
chrono = "0.4.26"
diesel = { version = "2.1.0", features = ["sqlite", "chrono"] }
json = "0.12.4"
ureq = "2.7.1"


================================================
FILE: rust/diesel.toml
================================================
# For documentation on how to configure this file,
# see https://diesel.rs/guides/configuring-diesel-cli

[print_schema]
file = "src/schema.rs"
custom_type_derives = ["diesel::query_builder::QueryId"]

[migrations_directory]
dir = "migrations"


================================================
FILE: rust/flake.nix
================================================
{
  description = "update nixpkgs automatically";

  outputs = { self, nixpkgs } @ args: let
    pkgs = nixpkgs.legacyPackages.x86_64-linux;
  in {
    devShells.x86_64-linux.default = pkgs.mkShell {
      packages = [
        pkgs.cargo
        pkgs.clippy
        pkgs.sqlite
        pkgs.diesel-cli
        pkgs.openssl
        pkgs.pkg-config
        pkgs.rustfmt
      ];
    };
  };
}


================================================
FILE: rust/migrations/2023-08-12-152848_create_packages/down.sql
================================================
DROP TABLE packages;


================================================
FILE: rust/migrations/2023-08-12-152848_create_packages/up.sql
================================================
CREATE TABLE packages (
    id TEXT PRIMARY KEY NOT NULL
  , attr_path TEXT NOT NULL
  , last_update_attempt DATETIME
  , last_update_log DATETIME
  , version_nixpkgs_master TEXT
  , last_checked_nixpkgs_master DATETIME
  , version_nixpkgs_staging TEXT
  , last_checked_nixpkgs_staging DATETIME
  , version_nixpkgs_staging_next TEXT
  , last_checked_nixpkgs_staging_next DATETIME
  , version_repology TEXT
  , project_repology TEXT
  , nixpkgs_name_replogy TEXT
  , last_checked_repology DATETIME
  , version_github TEXT
  , owner_github TEXT
  , repo_github TEXT
  , last_checked_github DATETIME
  , version_gitlab TEXT
  , owner_gitlab TEXT
  , repo_gitlab TEXT
  , last_checked_gitlab DATETIME
  , package_name_pypi TEXT
  , version_pypi TEXT
  , last_checked_pypi DATETIME
  , pending_pr INTEGER
  , pending_pr_owner TEXT
  , pending_pr_branch_name TEXT
  , last_checked_pending_pr DATETIME
)


================================================
FILE: rust/src/github.rs
================================================
use crate::nix;

fn token() -> Option<String> {
    if let Ok(token) = std::env::var("GH_TOKEN") {
        return Some(token);
    }
    if let Ok(token) = std::env::var("GITHUB_TOKEN") {
        return Some(token);
    }
    None
}

pub fn latest_release(github: &Github) -> Result<json::JsonValue, &'static str> {
    let mut request = ureq::get(&format!(
        "https://api.github.com/repos/{}/{}/releases/latest",
        github.owner, github.repo,
    ))
    .set("Accept", "application/vnd.github+json")
    .set("X-GitHub-Api-Version", "2022-11-28");

    if let Some(token) = token() {
        request = request.set("Authorization", &format!("Bearer {}", token));
    }

    let body = request.call().unwrap().into_string().unwrap();
    if let json::JsonValue::Object(response) = json::parse(&body).unwrap() {
        return Ok(response["tag_name"].clone());
    }
    Err("Couldn't find")
}

pub struct Github {
    pub owner: String,
    pub repo: String,
}

pub fn from(attr_path: &String) -> Option<Github> {
    let url = nix::eval("master", attr_path, "(drv: drv.src.url)").unwrap();
    if !url.contains("github") {
        return None;
    }
    let owner = nix::eval("master", attr_path, "(drv: drv.src.owner)").unwrap();
    let repo = nix::eval("master", attr_path, "(drv: drv.src.repo)").unwrap();
    Some(Github {
        owner: owner.to_string(),
        repo: repo.to_string(),
    })
}


================================================
FILE: rust/src/lib.rs
================================================
pub mod models;
pub mod schema;

use diesel::prelude::*;
use std::env;

pub fn establish_connection() -> SqliteConnection {
    let database_url = env::var("DATABASE_URL").expect("DATABASE_URL must be set");
    SqliteConnection::establish(&database_url)
        .unwrap_or_else(|_| panic!("Error connecting to {}", database_url))
}


================================================
FILE: rust/src/main.rs
================================================
mod github;
mod nix;
mod repology;

use chrono::offset::Utc;
use diesel::prelude::*;
use nixpkgs_update::models::*;
use nixpkgs_update::*;

fn version_in_nixpkgs_branch(branch: &str, attr_path: &String) -> Option<String> {
    nix::eval(branch, attr_path, "(drv: drv.version)")
}

fn version_in_nixpkgs_master(attr_path: &String) -> Option<String> {
    version_in_nixpkgs_branch("master", attr_path)
}

fn version_in_nixpkgs_staging(attr_path: &String) -> Option<String> {
    version_in_nixpkgs_branch("staging", attr_path)
}

fn version_in_nixpkgs_staging_next(attr_path: &String) -> Option<String> {
    version_in_nixpkgs_branch("staging-next", attr_path)
}

fn main() {
    use nixpkgs_update::schema::packages::dsl::*;

    let connection = &mut establish_connection();
    let results: Vec<Package> = packages.load(connection).expect("Error loading packages");

    println!("Displaying {} packages", results.len());
    for package in results {
        println!("{} {}", package.id, package.attr_path);

        let result: String = repology::latest_version(&package.attr_path)
            .unwrap()
            .to_string();
        println!("newest repology version {}", result);

        let version_in_nixpkgs_master: String =
            version_in_nixpkgs_master(&package.attr_path).unwrap();
        println!("nixpkgs master version {}", version_in_nixpkgs_master);

        let version_in_nixpkgs_staging: String =
            version_in_nixpkgs_staging(&package.attr_path).unwrap();
        println!("nixpkgs staging version {}", version_in_nixpkgs_staging);

        let version_in_nixpkgs_staging_next: String =
            version_in_nixpkgs_staging_next(&package.attr_path).unwrap();
        println!(
            "nixpkgs staging_next version {}",
            version_in_nixpkgs_staging_next
        );

        let now = Some(Utc::now().naive_utc());
        diesel::update(packages.find(&package.id))
            .set((
                version_nixpkgs_master.eq(Some(version_in_nixpkgs_master)),
                last_checked_nixpkgs_master.eq(now),
                version_nixpkgs_staging.eq(Some(version_in_nixpkgs_staging)),
                last_checked_nixpkgs_staging.eq(now),
                version_nixpkgs_staging_next.eq(Some(version_in_nixpkgs_staging_next)),
                last_checked_nixpkgs_staging_next.eq(now),
                version_repology.eq(Some(result)),
                last_checked_repology.eq(now),
            ))
            .execute(connection)
            .unwrap();

        if let Some(github) = github::from(&package.attr_path) {
            println!("found github for {}", package.attr_path);

            let vgithub: String = github::latest_release(&github).unwrap().to_string();
            println!("version github {}", vgithub);
            let now = Some(Utc::now().naive_utc());
            diesel::update(packages.find(&package.id))
                .set((
                    version_github.eq(Some(vgithub)),
                    owner_github.eq(Some(github.owner)),
                    repo_github.eq(Some(github.repo)),
                    last_checked_github.eq(now),
                ))
                .execute(connection)
                .unwrap();
        }
    }
}


================================================
FILE: rust/src/models.rs
================================================
use chrono::NaiveDateTime;
use diesel::prelude::*;

#[derive(Queryable, Selectable)]
#[diesel(table_name = crate::schema::packages)]
#[diesel(check_for_backend(diesel::sqlite::Sqlite))]
pub struct Package {
    pub id: String,
    pub attr_path: String,
    pub last_update_attempt: Option<NaiveDateTime>,
    pub last_update_log: Option<NaiveDateTime>,
    pub version_nixpkgs_master: Option<String>,
    pub last_checked_nixpkgs_master: Option<NaiveDateTime>,
    pub version_nixpkgs_staging: Option<String>,
    pub last_checked_nixpkgs_staging: Option<NaiveDateTime>,
    pub version_nixpkgs_staging_next: Option<String>,
    pub last_checked_nixpkgs_staging_next: Option<NaiveDateTime>,
    pub project_repology: Option<String>,
    pub nixpkgs_name_replogy: Option<String>,
    pub version_repology: Option<String>,
    pub last_checked_repology: Option<NaiveDateTime>,
    pub owner_github: Option<String>,
    pub repo_github: Option<String>,
    pub version_github: Option<String>,
    pub last_checked_github: Option<NaiveDateTime>,
    pub owner_gitlab: Option<String>,
    pub repo_gitlab: Option<String>,
    pub version_gitlab: Option<String>,
    pub last_checked_gitlab: Option<NaiveDateTime>,
    pub package_name_pypi: Option<String>,
    pub version_pypi: Option<String>,
    pub last_checked_pypi: Option<NaiveDateTime>,
    pub pending_pr: Option<i32>,
    pub pending_pr_owner: Option<String>,
    pub pending_pr_branch_name: Option<String>,
    pub last_checked_pending_pr: Option<NaiveDateTime>,
}


================================================
FILE: rust/src/nix.rs
================================================
use std::process::Command;

pub fn eval(branch: &str, attr_path: &String, apply: &str) -> Option<String> {
    let output = Command::new("nix")
        .arg("eval")
        .arg("--raw")
        .arg("--refresh")
        .arg(&format!("github:nixos/nixpkgs/{}#{}", branch, attr_path))
        .arg("--apply")
        .arg(apply)
        .output();
    match output {
        // Only trust stdout when `nix eval` actually succeeded; otherwise a
        // failed evaluation would be reported as an empty version string.
        Ok(output) if output.status.success() => {
            Some(String::from_utf8_lossy(&output.stdout).to_string())
        }
        _ => None,
    }
}


================================================
FILE: rust/src/repology.rs
================================================
pub fn latest_version(project_name: &String) -> Result<json::JsonValue, &'static str> {
    let body = ureq::get(&format!(
        "https://repology.org/api/v1/project/{}",
        project_name
    ))
    .set("User-Agent", "nixpkgs-update")
    .call()
    .unwrap()
    .into_string()
    .unwrap();
    let json = json::parse(&body).unwrap();
    if let json::JsonValue::Array(projects) = json {
        for project in projects {
            if let json::JsonValue::Object(project_repo) = project {
                if project_repo["status"] == "newest" {
                    return Ok(project_repo.get("version").unwrap().clone());
                }
            }
        }
    }
    Err("Couldn't find")
}


================================================
FILE: rust/src/schema.rs
================================================
// @generated automatically by Diesel CLI.

diesel::table! {
    packages (id) {
        id -> Text,
        attr_path -> Text,
        last_update_attempt -> Nullable<Timestamp>,
        last_update_log -> Nullable<Timestamp>,
        version_nixpkgs_master -> Nullable<Text>,
        last_checked_nixpkgs_master -> Nullable<Timestamp>,
        version_nixpkgs_staging -> Nullable<Text>,
        last_checked_nixpkgs_staging -> Nullable<Timestamp>,
        version_nixpkgs_staging_next -> Nullable<Text>,
        last_checked_nixpkgs_staging_next -> Nullable<Timestamp>,
        version_repology -> Nullable<Text>,
        project_repology -> Nullable<Text>,
        nixpkgs_name_replogy -> Nullable<Text>,
        last_checked_repology -> Nullable<Timestamp>,
        version_github -> Nullable<Text>,
        owner_github -> Nullable<Text>,
        repo_github -> Nullable<Text>,
        last_checked_github -> Nullable<Timestamp>,
        version_gitlab -> Nullable<Text>,
        owner_gitlab -> Nullable<Text>,
        repo_gitlab -> Nullable<Text>,
        last_checked_gitlab -> Nullable<Timestamp>,
        package_name_pypi -> Nullable<Text>,
        version_pypi -> Nullable<Text>,
        last_checked_pypi -> Nullable<Timestamp>,
        pending_pr -> Nullable<Integer>,
        pending_pr_owner -> Nullable<Text>,
        pending_pr_branch_name -> Nullable<Text>,
        last_checked_pending_pr -> Nullable<Timestamp>,
    }
}


================================================
FILE: src/CVE.hs
================================================
{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE RecordWildCards #-}

module CVE
  ( parseFeed,
    CPE (..),
    CPEMatch (..),
    CPEMatchRow (..),
    cpeMatches,
    CVE (..),
    CVEID,
    cveLI,
  )
where

import Data.Aeson
  ( FromJSON,
    Key,
    Object,
    eitherDecode,
    parseJSON,
    withObject,
    (.!=),
    (.:),
    (.:!),
  )
import Data.Aeson.Types (Parser, prependFailure)
import qualified Data.ByteString.Lazy.Char8 as BSL
import Data.List (intercalate)
import qualified Data.Text as T
import Data.Time.Clock (UTCTime)
import Database.SQLite.Simple (FromRow, ToRow, field, fromRow, toRow)
import Database.SQLite.Simple.ToField (toField)
import OurPrelude
import Utils (Boundary (..), VersionMatcher (..))

type CVEID = Text

data CVE = CVE
  { cveID :: CVEID,
    cveCPEMatches :: [CPEMatch],
    cveDescription :: Text,
    cvePublished :: UTCTime,
    cveLastModified :: UTCTime
  }
  deriving (Show, Eq, Ord)

-- | Render a CVE as a Markdown list item linking to its NVD entry.
cveLI :: CVE -> Bool -> Text
cveLI c patched =
  "- ["
    <> cveID c
    <> "](https://nvd.nist.gov/vuln/detail/"
    <> cveID c
    <> ")"
    <> p
  where
    p =
      if patched
        then " (patched)"
        else ""

data CPEMatch = CPEMatch
  { cpeMatchCPE :: CPE,
    cpeMatchVulnerable :: Bool,
    cpeMatchVersionMatcher :: VersionMatcher
  }
  deriving (Show, Eq, Ord)

instance FromRow CPEMatch where
  fromRow = do
    cpeMatchCPE <- fromRow
    let cpeMatchVulnerable = True
    cpeMatchVersionMatcher <- field
    pure CPEMatch {..}

-- This decodes an entire CPE string and related attributes, but we only use
-- cpeMatchVulnerable, cpeProduct, cpeVersion and cpeMatchVersionMatcher.
data CPE = CPE
  { cpePart :: (Maybe Text),
    cpeVendor :: (Maybe Text),
    cpeProduct :: (Maybe Text),
    cpeVersion :: (Maybe Text),
    cpeUpdate :: (Maybe Text),
    cpeEdition :: (Maybe Text),
    cpeLanguage :: (Maybe Text),
    cpeSoftwareEdition :: (Maybe Text),
    cpeTargetSoftware :: (Maybe Text),
    cpeTargetHardware :: (Maybe Text),
    cpeOther :: (Maybe Text)
  }
  deriving (Eq, Ord)

instance Show CPE where
  show
    CPE
      { cpePart,
        cpeVendor,
        cpeProduct,
        cpeVersion,
        cpeUpdate,
        cpeEdition,
        cpeLanguage,
        cpeSoftwareEdition,
        cpeTargetSoftware,
        cpeTargetHardware,
        cpeOther
      } =
      "CPE {"
        <> (intercalate ", " . concat)
          [ cpeField "part" cpePart,
            cpeField "vendor" cpeVendor,
            cpeField "product" cpeProduct,
            cpeField "version" cpeVersion,
            cpeField "update" cpeUpdate,
            cpeField "edition" cpeEdition,
            cpeField "language" cpeLanguage,
            cpeField "softwareEdition" cpeSoftwareEdition,
            cpeField "targetSoftware" cpeTargetSoftware,
            cpeField "targetHardware" cpeTargetHardware,
            cpeField "other" cpeOther
          ]
        <> "}"
      where
        cpeField :: Show a => String -> Maybe a -> [String]
        cpeField _ Nothing = []
        cpeField name (Just value) = [name <> " = " <> show value]

instance ToRow CPE where
  toRow
    CPE
      { cpePart,
        cpeVendor,
        cpeProduct,
        cpeVersion,
        cpeUpdate,
        cpeEdition,
        cpeLanguage,
        cpeSoftwareEdition,
        cpeTargetSoftware,
        cpeTargetHardware,
        cpeOther
      } =
      fmap -- There is no toRow instance for a tuple this large
        toField
        [ cpePart,
          cpeVendor,
          cpeProduct,
          cpeVersion,
          cpeUpdate,
          cpeEdition,
          cpeLanguage,
          cpeSoftwareEdition,
          cpeTargetSoftware,
          cpeTargetHardware,
          cpeOther
        ]

instance FromRow CPE where
  fromRow = do
    cpePart <- field
    cpeVendor <- field
    cpeProduct <- field
    cpeVersion <- field
    cpeUpdate <- field
    cpeEdition <- field
    cpeLanguage <- field
    cpeSoftwareEdition <- field
    cpeTargetSoftware <- field
    cpeTargetHardware <- field
    cpeOther <- field
    pure CPE {..}

-- | Parse a @description_data@ subtree and return the concatenation of the
-- English-language descriptions.
parseDescription :: Object -> Parser Text
parseDescription o = do
  dData <- o .: "description_data"
  descriptions <-
    fmap concat $
      sequence $
        flip map dData $
          \dDatum -> do
            value <- dDatum .: "value"
            lang :: Text <- dDatum .: "lang"
            pure $
              case lang of
                "en" -> [value]
                _ -> []
  pure $ T.intercalate "\n\n" descriptions

instance FromJSON CVE where
  parseJSON =
    withObject "CVE" $ \o -> do
      cve <- o .: "cve"
      meta <- cve .: "CVE_data_meta"
      cveID <- meta .: "ID"
      prependFailure (T.unpack cveID <> ": ") $ do
        cfgs <- o .: "configurations"
        cveCPEMatches <- parseConfigurations cfgs
        cvePublished <- o .: "publishedDate"
        cveLastModified <- o .: "lastModifiedDate"
        description <- cve .: "description"
        cveDescription <- parseDescription description
        pure CVE {..}

instance ToRow CVE where
  toRow CVE {cveID, cveDescription, cvePublished, cveLastModified} =
    toRow (cveID, cveDescription, cvePublished, cveLastModified)

instance FromRow CVE where
  fromRow = do
    let cveCPEMatches = []
    cveID <- field
    cveDescription <- field
    cvePublished <- field
    cveLastModified <- field
    pure CVE {..}

splitCPE :: Text -> [Maybe Text]
splitCPE =
  map (toMaybe . T.replace "\a" ":") . T.splitOn ":" . T.replace "\\:" "\a"
  where
    toMaybe "*" = Nothing
    toMaybe x = Just x
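`splitCPE` avoids splitting on escaped colons by swapping `\:` for the control character `\a`, splitting on `:`, then swapping back; a `*` field (meaning "any") becomes `Nothing`. The same escape-split-restore trick in Rust (std-only, illustrative sketch):

```rust
// Split a CPE 2.3 URI on ':' while honoring escaped colons ("\:"),
// mirroring splitCPE: swap the escape for '\x07', split, swap back.
// A "*" field means "any value", represented here as None.
fn split_cpe(s: &str) -> Vec<Option<String>> {
    s.replace("\\:", "\x07")
        .split(':')
        .map(|part| {
            let restored = part.replace('\x07', ":");
            if restored == "*" { None } else { Some(restored) }
        })
        .collect()
}

fn main() {
    // An escaped colon survives, and "*" becomes None.
    println!("{:?}", split_cpe("cpe:2.3:a:vendor:name\\:x:1.0:*"));
}
```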

instance FromJSON CPEMatch where
  parseJSON =
    withObject "CPEMatch" $ \o -> do
      t <- o .: "cpe23Uri"
      cpeMatchCPE <-
        case splitCPE t of
          [Just "cpe", Just "2.3", cpePart, cpeVendor, cpeProduct, cpeVersion, cpeUpdate, cpeEdition, cpeLanguage, cpeSoftwareEdition, cpeTargetSoftware, cpeTargetHardware, cpeOther] ->
            pure CPE {..}
          _ -> fail $ "unparsable cpe23Uri: " <> T.unpack t
      cpeMatchVulnerable <- o .: "vulnerable"
      vStartIncluding <- o .:! "versionStartIncluding"
      vEndIncluding <- o .:! "versionEndIncluding"
      vStartExcluding <- o .:! "versionStartExcluding"
      vEndExcluding <- o .:! "versionEndExcluding"
      startBoundary <-
        case (vStartIncluding, vStartExcluding) of
          (Nothing, Nothing) -> pure Unbounded
          (Just start, Nothing) -> pure (Including start)
          (Nothing, Just start) -> pure (Excluding start)
          (Just _, Just _) -> fail "multiple version starts"
      endBoundary <-
        case (vEndIncluding, vEndExcluding) of
          (Nothing, Nothing) -> pure Unbounded
          (Just end, Nothing) -> pure (Including end)
          (Nothing, Just end) -> pure (Excluding end)
          (Just _, Just _) -> fail "multiple version ends"
      cpeMatchVersionMatcher <-
        case (cpeVersion cpeMatchCPE, startBoundary, endBoundary) of
          (Just v, Unbounded, Unbounded) -> pure $ SingleMatcher v
          (Nothing, start, end) -> pure $ RangeMatcher start end
          _ ->
            fail
              ( "cpe_match has both version "
                  <> show (cpeVersion cpeMatchCPE)
                  <> " in cpe, and boundaries from "
                  <> show startBoundary
                  <> " to "
                  <> show endBoundary
              )
      pure (CPEMatch {..})

data CPEMatchRow
  = CPEMatchRow CVE CPEMatch

instance ToRow CPEMatchRow where
  toRow (CPEMatchRow CVE {cveID} CPEMatch {cpeMatchCPE, cpeMatchVersionMatcher}) =
    [toField $ Just cveID]
      ++ toRow cpeMatchCPE
      ++ [toField cpeMatchVersionMatcher]

instance FromRow CPEMatchRow where
  fromRow = do
    let cveCPEMatches = []
    let cveDescription = undefined
    let cvePublished = undefined
    let cveLastModified = undefined
    cveID <- field
    cpeM <- fromRow
    pure $ CPEMatchRow (CVE {..}) cpeM

cpeMatches :: [CVE] -> [CPEMatchRow]
cpeMatches = concatMap rows
  where
    rows cve = fmap (CPEMatchRow cve) (cveCPEMatches cve)

guardAttr :: (Eq a, FromJSON a, Show a) => Object -> Key -> a -> Parser ()
guardAttr object attribute expected = do
  actual <- object .: attribute
  unless (actual == expected) $
    fail $
      "unexpected "
        <> show attribute
        <> ", expected "
        <> show expected
        <> ", got "
        <> show actual

boundedMatcher :: VersionMatcher -> Bool
boundedMatcher (RangeMatcher Unbounded Unbounded) = False
boundedMatcher _ = True

-- Because complex boolean formulas can't be used to determine if a single
-- product/version is vulnerable, we simply use all leaves marked vulnerable.
parseNode :: Object -> Parser [CPEMatch]
parseNode node = do
  maybeChildren <- node .:! "children"
  case maybeChildren of
    Nothing -> do
      matches <- node .:! "cpe_match" .!= []
      pure $
        filter (cpeMatchVersionMatcher >>> boundedMatcher) $
          filter cpeMatchVulnerable matches
    Just children -> do
      fmap concat $ sequence $ map parseNode children

parseConfigurations :: Object -> Parser [CPEMatch]
parseConfigurations o = do
  guardAttr o "CVE_data_version" ("4.0" :: Text)
  nodes <- o .: "nodes"
  fmap concat $ sequence $ map parseNode nodes
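As the comment above `parseNode` says, the boolean structure of NVD configuration nodes is ignored: a node with children recurses into them, and a leaf contributes only its matches marked vulnerable. That traversal, sketched in Rust over a simplified node type (the `Node` struct and `collect_vulnerable` helper are illustrative, not from this repo):

```rust
// Simplified NVD configuration node: leaves carry (vulnerable, product)
// matches, inner nodes carry children combined by boolean operators
// that we deliberately ignore, as parseNode does.
struct Node {
    matches: Vec<(bool, &'static str)>,
    children: Vec<Node>,
}

// Mirror of parseNode: recurse into children when present, otherwise
// keep only the leaf matches flagged vulnerable.
fn collect_vulnerable(node: &Node) -> Vec<&'static str> {
    if node.children.is_empty() {
        node.matches.iter().filter(|m| m.0).map(|m| m.1).collect()
    } else {
        node.children.iter().flat_map(collect_vulnerable).collect()
    }
}

fn main() {
    let tree = Node {
        matches: vec![],
        children: vec![Node {
            matches: vec![(true, "foo"), (false, "bar")],
            children: vec![],
        }],
    };
    println!("{:?}", collect_vulnerable(&tree));
}
```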

parseFeed :: BSL.ByteString -> Either Text [CVE]
parseFeed = bimap T.pack cvefItems . eitherDecode

data CVEFeed = CVEFeed
  { cvefItems :: [CVE]
  }

instance FromJSON CVEFeed where
  parseJSON = withObject "CVEFeed" $ \o -> CVEFeed <$> o .: "CVE_Items"


================================================
FILE: src/Check.hs
================================================
{-# LANGUAGE ExtendedDefaultRules #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE QuasiQuotes #-}
{-# LANGUAGE TemplateHaskell #-}
{-# OPTIONS_GHC -fno-warn-type-defaults #-}

module Check
  ( result,
    -- exposed for testing:
    hasVersion,
    versionWithoutPath,
  )
where

import Control.Applicative (many)
import Data.Char (isDigit, isLetter)
import Data.Maybe (fromJust)
import qualified Data.Text as T
import qualified Data.Text.IO as T
import Language.Haskell.TH.Env (envQ)
import OurPrelude
import System.Exit ()
import Text.Regex.Applicative.Text (RE', (=~))
import qualified Text.Regex.Applicative.Text as RE
import Utils (UpdateEnv (..), nixBuildOptions)

default (T.Text)

treeBin :: String
treeBin = fromJust ($$(envQ "TREE") :: Maybe String) <> "/bin/tree"

procTree :: [String] -> ProcessConfig () () ()
procTree = proc treeBin

gistBin :: String
gistBin = fromJust ($$(envQ "GIST") :: Maybe String) <> "/bin/gist"

data BinaryCheck = BinaryCheck
  { filePath :: FilePath,
    zeroExitCode :: Bool,
    versionPresent :: Bool
  }

isWordCharacter :: Char -> Bool
isWordCharacter c = (isDigit c) || (isLetter c)

isNonWordCharacter :: Char -> Bool
isNonWordCharacter c = not (isWordCharacter c)

-- | Construct regex: /.*\b${version}\b.*/s
versionRegex :: Text -> RE' ()
versionRegex version =
  (\_ -> ())
    <$> ( (((many RE.anySym) <* (RE.psym isNonWordCharacter)) <|> (RE.pure ""))
            *> (RE.string version)
            <* ((RE.pure "") <|> ((RE.psym isNonWordCharacter) *> (many RE.anySym)))
        )

hasVersion :: Text -> Text -> Bool
hasVersion contents expectedVersion =
  isJust $ contents =~ versionRegex expectedVersion

checkTestsBuild :: Text -> IO Bool
checkTestsBuild attrPath = do
  let timeout = "30m"
  let args =
        [T.unpack timeout, "nix-build"]
          ++ nixBuildOptions
          ++ [ "-E",
               "{ config }: (import ./. { inherit config; })."
                 ++ (T.unpack attrPath)
                 ++ ".tests or {}"
             ]
  r <- runExceptT $ ourReadProcessInterleaved $ proc "timeout" args
  case r of
    Left errorMessage -> do
      T.putStrLn $ attrPath <> ".tests process failed with output: " <> errorMessage
      return False
    Right (exitCode, output) -> do
      case exitCode of
        ExitFailure 124 -> do
          T.putStrLn $ attrPath <> ".tests took longer than " <> timeout <> " and timed out. Other output: " <> output
          return False
        ExitSuccess -> return True
        _ -> return False

checkTestsBuildReport :: Bool -> Text
checkTestsBuildReport False =
  "\n> [!CAUTION]\n> A test defined in `passthru.tests` did not pass.\n"
checkTestsBuildReport True =
  "- The tests defined in `passthru.tests`, if any, passed"

versionWithoutPath :: String -> Text -> String
versionWithoutPath resultPath expectedVersion =
  -- We want to match expectedVersion, except when it is preceded by
  -- the new store path (as wrappers contain the full store path, which
  -- often includes the version).
  -- This can be done with a negative lookbehind, e.g.
  -- /^(?<!${storePathWithoutVersion})${version}/
  -- Note we also escape the literals with \Q/\E for grep -P.
  let storePath = fromMaybe (T.pack resultPath) $ T.stripPrefix "/nix/store/" (T.pack resultPath)
   in case T.breakOn expectedVersion storePath of
        (_, "") ->
          -- no version in prefix, just match version
          "\\Q"
            <> T.unpack expectedVersion
            <> "\\E"
        (storePrefix, _) ->
          "(?<!\\Q"
            <> T.unpack storePrefix
            <> "\\E)\\Q"
            <> T.unpack expectedVersion
            <> "\\E"
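The pattern built above quotes both the store-path prefix and the version with PCRE's `\Q…\E` and guards the version with a negative lookbehind. The same string assembly in Rust (std-only; the `version_pattern` name is illustrative):

```rust
// Build the grep -P pattern as versionWithoutPath does: match the
// version literally, but not right after the store path's pre-version
// prefix (wrappers embed the full store path, which often contains
// the version). \Q...\E quotes literals for PCRE.
fn version_pattern(result_path: &str, version: &str) -> String {
    let store_path = result_path
        .strip_prefix("/nix/store/")
        .unwrap_or(result_path);
    match store_path.find(version) {
        // Version absent from the store path: just match it literally.
        None => format!("\\Q{}\\E", version),
        // Otherwise forbid a match directly after the prefix.
        Some(idx) => {
            let prefix = &store_path[..idx];
            format!("(?<!\\Q{}\\E)\\Q{}\\E", prefix, version)
        }
    }
}

fn main() {
    println!("{}", version_pattern("/nix/store/abc-hello-2.10", "2.10"));
}
```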

foundVersionInOutputs :: Text -> String -> IO (Maybe Text)
foundVersionInOutputs expectedVersion resultPath =
  hush
    <$> runExceptT
      ( do
          let regex = versionWithoutPath resultPath expectedVersion
          (exitCode, _) <-
            proc "grep" ["-rP", regex, resultPath]
              & ourReadProcessInterleaved
          case exitCode of
            ExitSuccess ->
              return $
                "- found "
                  <> expectedVersion
                  <> " with grep in "
                  <> T.pack resultPath
                  <> "\n"
            _ -> throwE "grep did not find version in outputs"
      )

foundVersionInFileNames :: Text -> String -> IO (Maybe Text)
foundVersionInFileNames expectedVersion resultPath =
  hush
    <$> runExceptT
      ( do
          (_, contents) <-
            shell ("find " <> resultPath) & ourReadProcessInterleaved
          (contents =~ versionRegex expectedVersion)
            & hoistMaybe
            & noteT (T.pack "Expected version not found")
          return $
            "- found "
              <> expectedVersion
              <> " in filename of file in "
              <> T.pack resultPath
              <> "\n"
      )

treeGist :: String -> IO (Maybe Text)
treeGist resultPath =
  hush
    <$> runExceptT
      ( do
          contents <- procTree [resultPath] & ourReadProcessInterleavedBS_
          g <-
            shell gistBin
              & setStdin (byteStringInput contents)
              & ourReadProcessInterleaved_
          return $ "- directory tree listing: " <> g <> "\n"
      )

duGist :: String -> IO (Maybe Text)
duGist resultPath =
  hush
    <$> runExceptT
      ( do
          contents <- proc "du" [resultPath] & ourReadProcessInterleavedBS_
          g <-
            shell gistBin
              & setStdin (byteStringInput contents)
              & ourReadProcessInterleaved_
          return $ "- du listing: " <> g <> "\n"
      )

result :: MonadIO m => UpdateEnv -> String -> m Text
result updateEnv resultPath =
  liftIO $ do
    let expectedVersion = newVersion updateEnv
    testsBuild <- checkTestsBuild (packageName updateEnv)
    someReports <-
      fromMaybe ""
        <$> foundVersionInOutputs expectedVersion resultPath
          <> foundVersionInFileNames expectedVersion resultPath
          <> treeGist resultPath
          <> duGist resultPath
    return $
      let testsBuildSummary = checkTestsBuildReport testsBuild
       in [interpolate|
              $testsBuildSummary
              $someReports
            |]


================================================
FILE: src/Data/Hex.hs
================================================
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE TypeSynonymInstances #-}

-----------------------------------------------------------------------------
-- |
-- Module      :  Data.Hex
-- Copyright   :  (c) Taru Karttunen 2009
-- License     :  BSD-style
-- Maintainer  :  taruti@taruti.net
-- Stability   :  provisional
-- Portability :  portable
--
-- Convert strings into hexadecimal and back.
module Data.Hex (Hex (..)) where

import Control.Monad
import qualified Data.ByteString.Char8 as B
import qualified Data.ByteString.Lazy.Char8 as L

-- | Convert strings into hexadecimal and back.
class Hex t where
  -- | Convert string into hexadecimal.
  hex :: t -> t

  -- | Convert from hexadecimal and fail on invalid input.
  unhex :: MonadFail m => t -> m t

instance Hex String where
  hex = Prelude.concatMap w
    where
      w ch =
        let s = "0123456789ABCDEF"
            x = fromEnum ch
         in [s !! div x 16, s !! mod x 16]
  unhex [] = return []
  unhex (a : b : r) = do
    x <- c a
    y <- c b
    liftM (toEnum ((x * 16) + y) :) $ unhex r
  unhex [_] = fail "Non-even length"

c :: MonadFail m => Char -> m Int
c '0' = return 0
c '1' = return 1
c '2' = return 2
c '3' = return 3
c '4' = return 4
c '5' = return 5
c '6' = return 6
c '7' = return 7
c '8' = return 8
c '9' = return 9
c 'A' = return 10
c 'B' = return 11
c 'C' = return 12
c 'D' = return 13
c 'E' = return 14
c 'F' = return 15
c 'a' = return 10
c 'b' = return 11
c 'c' = return 12
c 'd' = return 13
c 'e' = return 14
c 'f' = return 15
c _ = fail "Invalid hex digit!"

instance Hex B.ByteString where
  hex = B.pack . hex . B.unpack
  unhex x = liftM B.pack $ unhex $ B.unpack x

instance Hex L.ByteString where
  hex = L.pack . hex . L.unpack
  unhex x = liftM L.pack $ unhex $ L.unpack x
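The `Hex` class maps each byte to two uppercase hex digits and fails to decode odd-length input or invalid digits (accepting both letter cases). An equivalent std-only Rust sketch (`to_hex`/`from_hex` are illustrative names, not from this repo):

```rust
// Encode bytes as uppercase hex, two digits per byte (like `hex`).
fn to_hex(bytes: &[u8]) -> String {
    bytes.iter().map(|b| format!("{:02X}", b)).collect()
}

// Decode a hex string back to bytes, failing on odd length or invalid
// digits (like `unhex`); accepts both cases, as `c` does.
// Assumes ASCII input, so byte-indexed slicing is safe.
fn from_hex(s: &str) -> Option<Vec<u8>> {
    if s.len() % 2 != 0 {
        return None;
    }
    (0..s.len())
        .step_by(2)
        .map(|i| u8::from_str_radix(&s[i..i + 2], 16).ok())
        .collect()
}

fn main() {
    let h = to_hex(b"Hi");
    println!("{} {:?}", h, from_hex(&h));
}
```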


================================================
FILE: src/DeleteMerged.hs
================================================
{-# LANGUAGE OverloadedStrings #-}

module DeleteMerged
  ( deleteDone,
  )
where

import qualified Data.Text.IO as T
import qualified GH
import qualified Git
import GitHub.Data (Name, Owner)
import OurPrelude

deleteDone :: Bool -> Text -> Name Owner -> IO ()
deleteDone delete githubToken ghUser = do
  result <-
    runExceptT $ do
      Git.fetch
      Git.cleanAndResetTo "master"
      refs <- ExceptT $ GH.closedAutoUpdateRefs (GH.authFromToken githubToken) ghUser
      let branches = fmap (\r -> ("auto-update/" <> r)) refs
      if delete
        then liftIO $ Git.deleteBranchesEverywhere branches
        else liftIO $ do
          T.putStrLn $ "Would delete these branches for " <> tshow ghUser <> ":"
          mapM_ (T.putStrLn . tshow) branches
  case result of
    Left e -> T.putStrLn e
    _ -> return ()


================================================
FILE: src/File.hs
================================================
{-# LANGUAGE BlockArguments #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TemplateHaskell #-}

module File where

import qualified Data.Text as T
import Data.Text.IO as T
import OurPrelude
import Polysemy.Input
import Polysemy.Output

data File m a where
  Read :: FilePath -> File m Text
  Write :: FilePath -> Text -> File m ()

makeSem ''File

runIO ::
  Member (Embed IO) r =>
  Sem (File ': r) a ->
  Sem r a
runIO =
  interpret $ \case
    Read file -> embed $ T.readFile file
    Write file contents -> embed $ T.writeFile file contents

runPure ::
  [Text] ->
  Sem (File ': r) a ->
  Sem r ([Text], a)
runPure contentList =
  runOutputMonoid pure
    . runInputList contentList
    . reinterpret2 \case
      Read _file -> maybe "" id <$> input
      Write _file contents -> output contents

replace ::
  Member File r =>
  Text ->
  Text ->
  FilePath ->
  Sem r Bool
replace find replacement file = do
  contents <- File.read file
  let newContents = T.replace find replacement contents
  when (contents /= newContents) $ do
    File.write file newContents
  return $ contents /= newContents

replaceIO :: MonadIO m => Text -> Text -> FilePath -> m Bool
replaceIO find replacement file =
  liftIO $
    runFinal $
      embedToFinal $
        runIO $
          (replace find replacement file)


================================================
FILE: src/GH.hs
================================================
{-# LANGUAGE ExtendedDefaultRules #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE QuasiQuotes #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TupleSections #-}
{-# OPTIONS_GHC -fno-warn-type-defaults #-}

module GH
  ( releaseUrl,
    GH.untagName,
    authFromToken,
    checkExistingUpdatePR,
    closedAutoUpdateRefs,
    compareUrl,
    latestVersion,
    pr,
    prUpdate,
  )
where

import Control.Applicative (liftA2, some)
import Data.Aeson (FromJSON)
import Data.Bitraversable (bitraverse)
import qualified Data.Text as T
import qualified Data.Text.Encoding as T
import Data.Time.Clock (addUTCTime, getCurrentTime)
import qualified Data.Vector as V
import qualified Git
import qualified GitHub as GH
import GitHub.Data.Name (Name (..))
import Network.HTTP.Client (HttpException (..), HttpExceptionContent (..), responseStatus)
import Network.HTTP.Types.Status (statusCode)
import OurPrelude
import Text.Regex.Applicative.Text ((=~))
import qualified Text.Regex.Applicative.Text as RE
import Utils (UpdateEnv (..), Version)
import qualified Utils as U

default (T.Text)

gReleaseUrl :: MonadIO m => GH.Auth -> URLParts -> ExceptT Text m Text
gReleaseUrl auth (URLParts o r t) =
  ExceptT $
    bimap (T.pack . show) (GH.getUrl . GH.releaseHtmlUrl)
      <$> liftIO (GH.github auth (GH.releaseByTagNameR o r t))

releaseUrl :: MonadIO m => UpdateEnv -> Text -> ExceptT Text m Text
releaseUrl env url = do
  urlParts <- parseURL url
  gReleaseUrl (authFrom env) urlParts

pr :: MonadIO m => UpdateEnv -> Text -> Text -> Text -> Text -> ExceptT Text m (Bool, Text)
pr env title body prHead base = do
  tryPR `catchE` \case
    -- If creating the PR returns a 422, most likely cause is that the
    -- branch was deleted, so push it again and retry once.
    GH.HTTPError (HttpExceptionRequest _ (StatusCodeException r _))
      | statusCode (responseStatus r) == 422 ->
          Git.push env >> withExceptT (T.pack . show) tryPR
    e ->
      throwE . T.pack . show $ e
  where
    tryPR =
      ExceptT $
        fmap ((False,) . GH.getUrl . GH.pullRequestUrl)
          <$> ( liftIO $
                  ( GH.github
                      (authFrom env)
                      ( GH.createPullRequestR
                          (N "nixos")
                          (N "nixpkgs")
                          (GH.CreatePullRequest title body prHead base)
                      )
                  )
              )

prUpdate :: forall m. MonadIO m => UpdateEnv -> Text -> Text -> Text -> Text -> ExceptT Text m (Bool, Text)
prUpdate env title body prHead base = do
  let runRequest :: FromJSON a => GH.Request k a -> ExceptT Text m a
      runRequest = ExceptT . fmap (first (T.pack . show)) . liftIO . GH.github (authFrom env)
  let inNixpkgs f = f (N "nixos") (N "nixpkgs")

  prs <-
    runRequest $
      inNixpkgs GH.pullRequestsForR (GH.optionsHead prHead) GH.FetchAll

  case V.toList prs of
    [] -> pr env title body prHead base
    (_ : _ : _) -> throwE $ "Too many open PRs from " <> prHead
    [thePR] -> do
      let withExistingPR :: (GH.Name GH.Owner -> GH.Name GH.Repo -> GH.IssueNumber -> a) -> a
          withExistingPR f = inNixpkgs f (GH.simplePullRequestNumber thePR)

      _ <-
        runRequest $
          withExistingPR GH.updatePullRequestR $
            GH.EditPullRequest (Just title) Nothing Nothing Nothing Nothing

      _ <-
        runRequest $
          withExistingPR GH.createCommentR body

      return (True, GH.getUrl $ GH.simplePullRequestUrl thePR)

data URLParts = URLParts
  { owner :: GH.Name GH.Owner,
    repo :: GH.Name GH.Repo,
    tag :: Text
  }
  deriving (Show)

-- | Parse an owner, repo, and tag triplet out of a GitHub URL.
-- We accept URLs pointing to uploaded release assets
-- that are usually obtained with fetchurl, as well
-- as the generated archives that fetchFromGitHub downloads.
--
-- Examples:
--
-- >>> parseURLMaybe "https://github.com/blueman-project/blueman/releases/download/2.0.7/blueman-2.0.7.tar.xz"
-- Just (URLParts {owner = N "blueman-project", repo = N "blueman", tag = "2.0.7"})
--
-- >>> parseURLMaybe "https://github.com/arvidn/libtorrent/archive/libtorrent_1_1_11.tar.gz"
-- Just (URLParts {owner = N "arvidn", repo = N "libtorrent", tag = "libtorrent_1_1_11"})
--
-- >>> parseURLMaybe "https://gitlab.com/inkscape/lib2geom/-/archive/1.0/lib2geom-1.0.tar.gz"
-- Nothing
parseURLMaybe :: Text -> Maybe URLParts
parseURLMaybe url =
  let domain = RE.string "https://github.com/"
      slash = RE.sym '/'
      pathSegment = T.pack <$> some (RE.psym (/= '/'))
      extension = RE.string ".zip" <|> RE.string ".tar.gz"
      toParts n o = URLParts (N n) (N o)
      regex =
        ( toParts
            <$> (domain *> pathSegment)
            <* slash
            <*> pathSegment
            <*> (RE.string "/releases/download/" *> pathSegment)
            <* slash
            <* pathSegment
        )
          <|> ( toParts
                  <$> (domain *> pathSegment)
                  <* slash
                  <*> pathSegment
                  <*> (RE.string "/archive/" *> pathSegment)
                  <* extension
              )
   in url =~ regex

parseURL :: MonadIO m => Text -> ExceptT Text m URLParts
parseURL url =
  tryJust ("GitHub: " <> url <> " is not a GitHub URL.") (parseURLMaybe url)

compareUrl :: MonadIO m => Text -> Text -> ExceptT Text m Text
compareUrl urlOld urlNew = do
  oldParts <- parseURL urlOld
  newParts <- parseURL urlNew
  return $
    "https://github.com/"
      <> GH.untagName (owner newParts)
      <> "/"
      <> GH.untagName (repo newParts)
      <> "/compare/"
      <> tag oldParts
      <> "..."
      <> tag newParts

autoUpdateRefs :: GH.Auth -> GH.Name GH.Owner -> IO (Either Text (Vector (Text, GH.Name GH.GitCommit)))
autoUpdateRefs auth ghUser =
  GH.github auth (GH.referencesR ghUser "nixpkgs" GH.FetchAll)
    & ((fmap . fmapL) tshow)
    & ((fmap . fmapR) (fmap (liftA2 (,) (GH.gitReferenceRef >>> GH.untagName) (GH.gitReferenceObject >>> GH.gitObjectSha >>> N)) >>> V.mapMaybe (bitraverse (T.stripPrefix prefix) pure)))
  where
    prefix = "refs/heads/auto-update/"

openPRWithAutoUpdateRefFrom :: GH.Auth -> GH.Name GH.Owner -> Text -> IO (Either Text Bool)
openPRWithAutoUpdateRefFrom auth ghUser ref =
  GH.executeRequest
    auth
    ( GH.pullRequestsForR
        "nixos"
        "nixpkgs"
        (GH.optionsHead (GH.untagName ghUser <> ":" <> U.branchPrefix <> ref) <> GH.stateOpen)
        GH.FetchAll
    )
    <&> bimap (T.pack . show) (not . V.null)

commitIsOldEnoughToDelete :: GH.Auth -> GH.Name GH.Owner -> GH.Name GH.GitCommit -> IO Bool
commitIsOldEnoughToDelete auth ghUser sha = do
  now <- getCurrentTime
  let cutoff = addUTCTime (-30 * 60) now
  GH.executeRequest auth (GH.gitCommitR ghUser "nixpkgs" sha)
    <&> either (const False) ((< cutoff) . GH.gitUserDate . GH.gitCommitCommitter)

refShouldBeDeleted :: GH.Auth -> GH.Name GH.Owner -> (Text, GH.Name GH.GitCommit) -> IO Bool
refShouldBeDeleted auth ghUser (ref, sha) =
  liftA2
    (&&)
    (either (const False) not <$> openPRWithAutoUpdateRefFrom auth ghUser ref)
    (commitIsOldEnoughToDelete auth ghUser sha)

closedAutoUpdateRefs :: GH.Auth -> GH.Name GH.Owner -> IO (Either Text (Vector Text))
closedAutoUpdateRefs auth ghUser =
  runExceptT $ do
    aur :: Vector (Text, GH.Name GH.GitCommit) <- ExceptT $ GH.autoUpdateRefs auth ghUser
    ExceptT (Right . V.map fst <$> V.filterM (refShouldBeDeleted auth ghUser) aur)

authFromToken :: Text -> GH.Auth
authFromToken = GH.OAuth . T.encodeUtf8

authFrom :: UpdateEnv -> GH.Auth
authFrom = authFromToken . U.githubToken . options

checkExistingUpdatePR :: MonadIO m => UpdateEnv -> Text -> ExceptT Text m ()
checkExistingUpdatePR env attrPath = do
  searchResult <-
    ExceptT $
      liftIO $
        (GH.github (authFrom env) (GH.searchIssuesR search) GH.FetchAll)
          & fmap (first (T.pack . show))
  if T.length (openPRReport searchResult) == 0
    then return ()
    else
      throwE
        ( "There might already be an open PR for this update:\n"
            <> openPRReport searchResult
        )
  where
    title = U.prTitle env attrPath
    search = [interpolate|repo:nixos/nixpkgs $title |]
    openPRReport searchResult =
      GH.searchResultResults searchResult
        & V.filter (GH.issueClosedAt >>> isNothing)
        & V.filter (GH.issuePullRequest >>> isJust)
        & fmap report
        & V.toList
        & T.unlines
    report i = "- " <> GH.issueTitle i <> "\n  " <> tshow (GH.issueUrl i)

latestVersion :: MonadIO m => UpdateEnv -> Text -> ExceptT Text m Version
latestVersion env url = do
  urlParts <- parseURL url
  r <-
    fmapLT tshow $
      ExceptT $
        liftIO $
          GH.executeRequest (authFrom env) $
            GH.latestReleaseR (owner urlParts) (repo urlParts)
  return $ T.dropWhile (\c -> c == 'v' || c == 'V') (GH.releaseTagName r)


================================================
FILE: src/Git.hs
================================================
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TemplateHaskell #-}

module Git
  ( findAutoUpdateBranchMessage,
    mergeBase,
    cleanAndResetTo,
    commit,
    deleteBranchesEverywhere,
    delete1,
    diff,
    diffFileNames,
    fetch,
    fetchIfStale,
    headRev,
    push,
    nixpkgsDir,
    setupNixpkgs,
    Git.show,
    worktreeAdd,
    worktreeRemove,
  )
where

import Control.Concurrent
import Control.Exception
import qualified Data.ByteString as BS
import qualified Data.ByteString.Lazy as BSL
import Data.Maybe (fromJust)
import qualified Data.Text as T
import qualified Data.Text.IO as T
import Data.Time.Clock (addUTCTime, getCurrentTime)
import qualified Data.Vector as V
import Language.Haskell.TH.Env (envQ)
import OurPrelude hiding (throw)
import System.Directory (doesDirectoryExist, doesFileExist, getCurrentDirectory, getModificationTime, setCurrentDirectory)
import System.Environment.XDG.BaseDir (getUserCacheDir)
import System.Exit ()
import System.IO.Error (tryIOError)
import System.Posix.Env (setEnv)
import Utils (Options (..), UpdateEnv (..), branchName, branchPrefix)

bin :: String
bin = fromJust ($$(envQ "GIT") :: Maybe String) <> "/bin/git"

procGit :: [String] -> ProcessConfig () () ()
procGit = proc bin

clean :: ProcessConfig () () ()
clean = silently $ procGit ["clean", "-fdx"]

worktreeAdd :: FilePath -> Text -> UpdateEnv -> IO ()
worktreeAdd path commitish updateEnv =
  runProcessNoIndexIssue_IO $ silently $ procGit ["worktree", "add", "-b", T.unpack (branchName updateEnv), path, T.unpack commitish]

worktreeRemove :: FilePath -> IO ()
worktreeRemove path = do
  exists <- doesDirectoryExist path
  when exists $
    runProcessNoIndexIssue_IO $ silently $ procGit ["worktree", "remove", "--force", path]

checkout :: Text -> Text -> ProcessConfig () () ()
checkout branch target =
  silently $ procGit ["checkout", "-B", T.unpack branch, T.unpack target]

reset :: Text -> ProcessConfig () () ()
reset target = silently $ procGit ["reset", "--hard", T.unpack target]

delete1 :: Text -> IO ()
delete1 bName = ignoreExitCodeException $ runProcessNoIndexIssue_IO (delete1' bName)

delete1' :: Text -> ProcessConfig () () ()
delete1' branch = delete [branch]

delete :: [Text] -> ProcessConfig () () ()
delete branches = silently $ procGit (["branch", "-D"] ++ fmap T.unpack branches)

deleteOrigin :: [Text] -> ProcessConfig () () ()
deleteOrigin branches =
  silently $ procGit (["push", "origin", "--delete"] ++ fmap T.unpack branches)

cleanAndResetTo :: MonadIO m => Text -> ExceptT Text m ()
cleanAndResetTo branch =
  let target = "upstream/" <> branch
   in do
        runProcessNoIndexIssue_ $ silently $ procGit ["reset", "--hard"]
        runProcessNoIndexIssue_ clean
        runProcessNoIndexIssue_ $ checkout branch target
        runProcessNoIndexIssue_ $ reset target
        runProcessNoIndexIssue_ clean

show :: MonadIO m => Text -> Text -> ExceptT Text m Text
show branch file =
  readProcessInterleavedNoIndexIssue_ $ silently $ procGit ["show", T.unpack ("remotes/upstream/" <> branch <> ":" <> file)]

diff :: MonadIO m => Text -> ExceptT Text m Text
diff branch = readProcessInterleavedNoIndexIssue_ $ procGit ["diff", T.unpack branch]

diffFileNames :: MonadIO m => Text -> ExceptT Text m [Text]
diffFileNames branch =
  readProcessInterleavedNoIndexIssue_ (procGit ["diff", T.unpack branch, "--name-only"])
    & fmapRT T.lines

staleFetchHead :: MonadIO m => m Bool
staleFetchHead =
  liftIO $ do
    nixpkgsGit <- getUserCacheDir "nixpkgs"
    let fetchHead = nixpkgsGit <> "/.git/FETCH_HEAD"
    oneHourAgo <- addUTCTime (fromInteger $ -60 * 60) <$> getCurrentTime
    e <- tryIOError $ getModificationTime fetchHead
    case e of
      Left _ -> return True
      Right fetchedLast -> return (fetchedLast < oneHourAgo)

fetchIfStale :: MonadIO m => ExceptT Text m ()
fetchIfStale = whenM staleFetchHead fetch

fetch :: MonadIO m => ExceptT Text m ()
fetch =
  runProcessNoIndexIssue_ $
    silently $
      procGit ["fetch", "-q", "--prune", "--multiple", "upstream", "origin"]

push :: MonadIO m => UpdateEnv -> ExceptT Text m ()
push updateEnv =
  runProcessNoIndexIssue_
    ( procGit
        ( [ "push",
            "--force",
            "--set-upstream",
            "origin",
            T.unpack (branchName updateEnv)
          ]
            ++ ["--dry-run" | not (doPR (options updateEnv))]
        )
    )

nixpkgsDir :: IO FilePath
nixpkgsDir = do
  inNixpkgs <- inNixpkgsRepo
  if inNixpkgs
    then getCurrentDirectory
    else getUserCacheDir "nixpkgs"

-- Set up a Nixpkgs clone in $XDG_CACHE_DIR/nixpkgs.
-- Since we are going to fetch, git reset, clean, and commit, we use a cache
-- dir to avoid destroying any uncommitted work the user may have in PWD.
setupNixpkgs :: Text -> IO ()
setupNixpkgs ghUser = do
  fp <- nixpkgsDir
  exists <- doesDirectoryExist fp
  unless exists $ do
    procGit ["clone", "--origin", "upstream", "https://github.com/NixOS/nixpkgs.git", fp]
      & runProcess_
    setCurrentDirectory fp
    procGit ["remote", "add", "origin", "https://github.com/" <> T.unpack ghUser <> "/nixpkgs.git"]
      -- requires that the user has forked nixpkgs
      & runProcess_
  inNixpkgs <- inNixpkgsRepo
  unless inNixpkgs do
    setCurrentDirectory fp
    _ <- runExceptT fetchIfStale
    _ <- runExceptT $ cleanAndResetTo "master"
    return ()
  System.Posix.Env.setEnv "NIX_PATH" ("nixpkgs=" <> fp) True

mergeBase :: IO Text
mergeBase =
  readProcessInterleavedNoIndexIssue_IO
    (procGit ["merge-base", "upstream/master", "upstream/staging"])
    & fmap T.strip

-- Return Nothing if a remote auto-update branch for this package doesn't
-- exist. If a branch does exist, return Just the subject of its tip commit.
findAutoUpdateBranchMessage :: MonadIO m => Text -> ExceptT Text m (Maybe Text)
findAutoUpdateBranchMessage pName = do
  remoteBranches <-
    readProcessInterleavedNoIndexIssue_ (procGit ["branch", "--remote", "--format=%(refname:short) %(subject)"])
      & fmapRT (T.lines >>> fmap (T.strip >>> T.breakOn " "))
  return $
    lookup ("origin/" <> branchPrefix <> pName) remoteBranches
      & fmap (T.drop 1)
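The per-line parsing above (T.strip, then T.breakOn " ") splits each `git branch --remote` output line into a refname and a subject that still carries its leading space, which the final T.drop 1 removes. A standalone sketch (parseBranchLine is a hypothetical name and the branch line is illustrative):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.Text as T

-- Split a "refname subject" line at the first space; the second component
-- keeps its leading space, mirroring the T.breakOn " " call above.
parseBranchLine :: T.Text -> (T.Text, T.Text)
parseBranchLine = T.breakOn " " . T.strip

main :: IO ()
main = print (parseBranchLine "  origin/auto-update/hello hello: 2.10 -> 2.12")
```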

inNixpkgsRepo :: IO Bool
inNixpkgsRepo = do
  currentDir <- getCurrentDirectory
  doesFileExist (currentDir <> "/nixos/release.nix")

commit :: MonadIO m => Text -> ExceptT Text m ()
commit ref =
  runProcessNoIndexIssue_ (procGit ["commit", "-am", T.unpack ref])

headRev :: MonadIO m => ExceptT Text m Text
headRev = T.strip <$> readProcessInterleavedNoIndexIssue_ (procGit ["rev-parse", "HEAD"])

deleteBranchesEverywhere :: Vector Text -> IO ()
deleteBranchesEverywhere branches = do
  let branchList = V.toList branches
  unless (null branchList) $ do
    result <- runExceptT $ runProcessNoIndexIssue_ (delete branchList)
    case result of
      Left err1 -> T.putStrLn $ tshow err1
      Right success1 -> T.putStrLn $ tshow success1
    result2 <- runExceptT $ runProcessNoIndexIssue_ (deleteOrigin branchList)
    case result2 of
      Left err2 -> T.putStrLn $ tshow err2
      Right success2 -> T.putStrLn $ tshow success2

runProcessNoIndexIssue_IO ::
  ProcessConfig () () () -> IO ()
runProcessNoIndexIssue_IO config = go
  where
    go = do
      (code, out, e) <- readProcess config
      case code of
        ExitFailure 128
          | "index.lock" `BS.isInfixOf` BSL.toStrict e -> do
              threadDelay 100000
              go
        ExitSuccess -> return ()
        ExitFailure _ -> throw $ ExitCodeException code config out e

runProcessNoIndexIssue_ ::
  MonadIO m => ProcessConfig () () () -> ExceptT Text m ()
runProcessNoIndexIssue_ config = tryIOTextET go
  where
    go = do
      (code, out, e) <- readProcess config
      case code of
        ExitFailure 128
          | "index.lock" `BS.isInfixOf` BSL.toStrict e -> do
              threadDelay 100000
              go
        ExitSuccess -> return ()
        ExitFailure _ -> throw $ ExitCodeException code config out e

readProcessInterleavedNoIndexIssue_ ::
  MonadIO m => ProcessConfig () () () -> ExceptT Text m Text
readProcessInterleavedNoIndexIssue_ config = tryIOTextET go
  where
    go = do
      (code, out) <- readProcessInterleaved config
      case code of
        ExitFailure 128
          | "index.lock" `BS.isInfixOf` BSL.toStrict out -> do
              threadDelay 100000
              go
        ExitSuccess -> return $ bytestringToText out
        ExitFailure _ -> throw $ ExitCodeException code config out out

readProcessInterleavedNoIndexIssue_IO ::
  ProcessConfig () () () -> IO Text
readProcessInterleavedNoIndexIssue_IO config = go
  where
    go = do
      (code, out) <- readProcessInterleaved config
      case code of
        ExitFailure 128
          | "index.lock" `BS.isInfixOf` BSL.toStrict out -> do
              threadDelay 100000
              go
        ExitSuccess -> return $ bytestringToText out
        ExitFailure _ -> throw $ ExitCodeException code config out out


================================================
FILE: src/NVD.hs
================================================
{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE OverloadedStrings #-}

module NVD
  ( withVulnDB,
    getCVEs,
    Connection,
    ProductID,
    Version,
    CVE,
    CVEID,
    UTCTime,
  )
where

import CVE
  ( CPEMatch (..),
    CPEMatchRow (..),
    CVE (..),
    CVEID,
    cpeMatches,
    parseFeed,
  )
import Codec.Compression.GZip (decompress)
import Control.Exception (SomeException, try)
import Crypto.Hash.SHA256 (hashlazy)
import qualified Data.ByteString.Lazy.Char8 as BSL
import Data.Hex (hex, unhex)
import Data.List (group)
import qualified Data.Text as T
import Data.Time.Calendar (toGregorian)
import Data.Time.Clock
  ( UTCTime,
    diffUTCTime,
    getCurrentTime,
    nominalDay,
    utctDay,
  )
import Data.Time.ISO8601 (parseISO8601)
import Database.SQLite.Simple
  ( Connection,
    Only (..),
    Query (..),
    execute,
    executeMany,
    execute_,
    query,
    withConnection,
    withTransaction,
  )
import qualified NVDRules
import Network.HTTP.Conduit (simpleHttp)
import OurPrelude
import System.Directory
  ( XdgDirectory (..),
    createDirectoryIfMissing,
    getXdgDirectory,
    removeFile,
  )
import Utils (ProductID, Version)
import Version (matchVersion)

-- | Either @recent@, @modified@, or any year since @2002@.
type FeedID = String

type Extension = String

type Timestamp = UTCTime

type Checksum = BSL.ByteString

type DBVersion = Int

data Meta
  = Meta Timestamp Checksum

-- | Database version the software expects. If the database version differs
-- from this or the database has not been updated in more than 7.5 days, the
-- database will be deleted and rebuilt from scratch. Bump this when the
-- database layout changes or the build-time data filtering changes.
softwareVersion :: DBVersion
softwareVersion = 2

getDBPath :: IO FilePath
getDBPath = do
  cacheDir <- getXdgDirectory XdgCache "nixpkgs-update"
  createDirectoryIfMissing True cacheDir
  pure $ cacheDir </> "nvd.sqlite3"

withDB :: (Connection -> IO a) -> IO a
withDB action = do
  dbPath <- getDBPath
  withConnection dbPath action

markUpdated :: Connection -> IO ()
markUpdated conn = do
  now <- getCurrentTime
  execute conn "UPDATE meta SET last_update = ?" [now]

-- | Rebuild the entire database, redownloading all data.
rebuildDB :: IO ()
rebuildDB = do
  dbPath <- getDBPath
  removeFile dbPath
  withConnection dbPath $ \conn -> do
    execute_ conn "CREATE TABLE meta (db_version int, last_update text)"
    execute
      conn
      "INSERT INTO meta VALUES (?, ?)"
      (softwareVersion, "1970-01-01 00:00:00" :: Text)
    execute_ conn $
      Query $
        T.unlines
          [ "CREATE TABLE cves (",
            "  cve_id text PRIMARY KEY,",
            "  description text,",
            "  published text,",
            "  modified text)"
          ]
    execute_ conn $
      Query $
        T.unlines
          [ "CREATE TABLE cpe_matches (",
            "  cve_id text REFERENCES cve,",
            "  part text,",
            "  vendor text,",
            "  product text,",
            "  version text,",
            "  \"update\" text,",
            "  edition text,",
            "  language text,",
            "  software_edition text,",
            "  target_software text,",
            "  target_hardware text,",
            "  other text,",
            "  matcher text)"
          ]
    execute_ conn "CREATE INDEX matchers_by_cve ON cpe_matches(cve_id)"
    execute_ conn "CREATE INDEX matchers_by_product ON cpe_matches(product)"
    execute_ conn "CREATE INDEX matchers_by_vendor ON cpe_matches(vendor)"
    execute_
      conn
      "CREATE INDEX matchers_by_target_software ON cpe_matches(target_software)"
    years <- allYears
    forM_ years $ updateFeed conn
    markUpdated conn

feedURL :: FeedID -> Extension -> String
feedURL feed ext =
  "https://nvd.nist.gov/feeds/json/cve/1.1/nvdcve-1.1-" <> feed <> ext

throwString :: String -> IO a
throwString = ioError . userError

throwText :: Text -> IO a
throwText = throwString . T.unpack

allYears :: IO [FeedID]
allYears = do
  now <- getCurrentTime
  let (year, _, _) = toGregorian $ utctDay now
  return $ map show [2002 .. year]

parseMeta :: BSL.ByteString -> Either T.Text Meta
parseMeta raw = do
  let splitLine = second BSL.tail . BSL.break (== ':') . BSL.takeWhile (/= '\r')
  let fields = map splitLine $ BSL.lines raw
  lastModifiedDate <-
    note "no lastModifiedDate in meta" $ lookup "lastModifiedDate" fields
  sha256 <- note "no sha256 in meta" $ lookup "sha256" fields
  timestamp <-
    note "invalid lastModifiedDate in meta" $
      parseISO8601 $
        BSL.unpack lastModifiedDate
  checksum <- note "invalid sha256 in meta" $ unhex sha256
  return $ Meta timestamp checksum
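parseMeta treats each line of the NVD .meta file as a CRLF-terminated key:value pair. The same splitLine step, lifted out for illustration (it assumes a ':' is present on the line, as parseMeta does):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.Bifunctor (second)
import qualified Data.ByteString.Lazy.Char8 as BSL

-- Split one "key:value\r" line into (key, value), dropping the trailing CR
-- and the ':' separator; errors if the line has no ':'.
splitLine :: BSL.ByteString -> (BSL.ByteString, BSL.ByteString)
splitLine = second BSL.tail . BSL.break (== ':') . BSL.takeWhile (/= '\r')

main :: IO ()
main = print (splitLine "sha256:0123ABCD\r")
```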

getMeta :: FeedID -> IO Meta
getMeta feed = do
  raw <- simpleHttp $ feedURL feed ".meta"
  either throwText pure $ parseMeta raw

getCVE :: Connection -> CVEID -> IO CVE
getCVE conn cveID_ = do
  cves <-
    query
      conn
      ( Query $
          T.unlines
            [ "SELECT cve_id, description, published, modified",
              "FROM cves",
              "WHERE cve_id = ?"
            ]
      )
      (Only cveID_)
  case cves of
    [cve] -> pure cve
    [] -> fail $ "no cve with id " <> (T.unpack cveID_)
    _ -> fail $ "multiple cves with id " <> (T.unpack cveID_)

getCVEs :: Connection -> ProductID -> Version -> IO [CVE]
getCVEs conn productID version = do
  matches :: [CPEMatchRow] <-
    query
      conn
      ( Query $
          T.unlines
            [ "SELECT",
              "  cve_id,",
              "  part,",
              "  vendor,",
              "  product,",
              "  version,",
              "  \"update\",",
              "  edition,",
              "  language,",
              "  software_edition,",
              "  target_software,",
              "  target_hardware,",
              "  other,",
              "  matcher",
              "FROM cpe_matches",
              "WHERE vendor = ? or product = ? or edition = ? or software_edition = ? or target_software = ?",
              "ORDER BY cve_id"
            ]
      )
      (productID, productID, productID, productID, productID)
  let cveIDs =
        map head $
          group $
            flip mapMaybe matches $
              \(CPEMatchRow cve cpeMatch) ->
                if matchVersion (cpeMatchVersionMatcher cpeMatch) version
                  && NVDRules.filter cve cpeMatch productID version
                  then Just (cveID cve)
                  else Nothing
  forM cveIDs $ getCVE conn

putCVEs :: Connection -> [CVE] -> IO ()
putCVEs conn cves = do
  withTransaction conn $ do
    executeMany
      conn
      "DELETE FROM cves WHERE cve_id = ?"
      (map (Only . cveID) cves)
    executeMany
      conn
      ( Query $
          T.unlines
            [ "INSERT INTO cves(cve_id, description, published, modified)",
              "VALUES (?, ?, ?, ?)"
            ]
      )
      cves
    executeMany
      conn
      "DELETE FROM cpe_matches WHERE cve_id = ?"
      (map (Only . cveID) cves)
    executeMany
      conn
      ( Query $
          T.unlines
            [ "INSERT INTO cpe_matches(",
              "  cve_id,",
              "  part,",
              "  vendor,",
              "  product,",
              "  version,",
              "  \"update\",",
              "  edition,",
              "  language,",
              "  software_edition,",
              "  target_software,",
              "  target_hardware,",
              "  other,",
              "  matcher)",
              "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)"
            ]
      )
      (cpeMatches cves)

getDBMeta :: Connection -> IO (DBVersion, UTCTime)
getDBMeta conn = do
  rows <- query conn "SELECT db_version, last_update FROM meta" ()
  case rows of
    [meta] -> pure meta
    _ -> fail "failed to get meta information"

needsRebuild :: IO Bool
needsRebuild = do
  dbMeta <- try $ withDB getDBMeta
  currentTime <- getCurrentTime
  case dbMeta of
    Left (e :: SomeException) -> do
      putStrLn $ "rebuilding database because " <> show e
      pure True
    Right (dbVersion, t) ->
      pure $
        diffUTCTime currentTime t > (7.5 * nominalDay)
          || dbVersion /= softwareVersion
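The staleness half of the rebuild condition can be sketched in isolation (isStale is a hypothetical helper; the timestamps are illustrative):

```haskell
import Data.Time.Clock (UTCTime, diffUTCTime, nominalDay)

-- Rebuild when "now" is more than 7.5 days after the recorded last update,
-- mirroring the diffUTCTime test in needsRebuild.
isStale :: UTCTime -> UTCTime -> Bool
isStale now lastUpdate = diffUTCTime now lastUpdate > (7.5 * nominalDay)

main :: IO ()
main = do
  let lastUpdate = read "2024-01-01 00:00:00 UTC" :: UTCTime
      now = read "2024-01-09 00:00:00 UTC" :: UTCTime
  print (isStale now lastUpdate)  -- eight days elapsed, so stale
```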

-- | Download a feed and store it in the database.
updateFeed :: Connection -> FeedID -> IO ()
updateFeed conn feedID = do
  putStrLn $ "Updating National Vulnerability Database feed (" <> feedID <> ")"
  json <- downloadFeed feedID
  parsedCVEs <- either throwText pure $ parseFeed json
  putCVEs conn parsedCVEs

-- | Update the vulnerability database and run an action with a connection to
-- it.
withVulnDB :: (Connection -> IO a) -> IO a
withVulnDB action = do
  rebuild <- needsRebuild
  when rebuild rebuildDB
  withDB $ \conn -> do
    (_, lastUpdate) <- withDB getDBMeta
    currentTime <- getCurrentTime
    when (diffUTCTime currentTime lastUpdate > (0.25 * nominalDay)) $ do
      updateFeed conn "modified"
      markUpdated conn
    action conn

-- | Download a feed, verify its checksum against the published .meta file, and
-- return the decompressed contents as a ByteString.
downloadFeed :: FeedID -> IO BSL.ByteString
downloadFeed feed = do
  Meta _ expectedChecksum <- getMeta feed
  compressed <- simpleHttp $ feedURL feed ".json.gz"
  let raw = decompress compressed
  let actualChecksum = BSL.fromStrict $ hashlazy raw
  when (actualChecksum /= expectedChecksum) $
    throwString $
      "wrong hash, expected: "
        <> BSL.unpack (hex expectedChecksum)
        <> " got: "
        <> BSL.unpack (hex actualChecksum)
  return raw


================================================
FILE: src/NVDRules.hs
================================================
{-# LANGUAGE OverloadedStrings #-}

module NVDRules where

import CVE (CPE (..), CPEMatch (..), CVE (..))
import Data.Char (isDigit)
import qualified Data.Text as T
import OurPrelude
import Text.Regex.Applicative.Text (RE', anySym, many, psym, (=~))
import Utils (Boundary (..), ProductID, Version, VersionMatcher (..))

-- Package-specific rules for discarding false-positive CVE matches.
-- Return False to discard the CVE.
filter :: CVE -> CPEMatch -> ProductID -> Version -> Bool
filter _ cpeMatch "socat" v
  | cpeUpdatePresentAndNotPartOfVersion cpeMatch v = False -- TODO consider if this rule should be applied to all packages
filter _ cpeMatch "uzbl" v
  | isNothing (v =~ yearRegex)
      && "2009.12.22"
      `anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =
      False
  | isNothing (v =~ yearRegex)
      && "2010.04.03"
      `anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =
      False
filter _ cpeMatch "go" v
  | "."
      `T.isInfixOf` v
      && "-"
      `anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =
      False
filter _ cpeMatch "terraform" _
  | cpeTargetSoftware (cpeMatchCPE cpeMatch) == Just "aws" = False
filter cve _ "tor" _
  | cveID cve == "CVE-2017-16541" = False
filter _ cpeMatch "arena" _
  | cpeVendor (cpeMatchCPE cpeMatch) == Just "rockwellautomation"
      || cpeVendor (cpeMatchCPE cpeMatch) == Just "openforis" =
      False
filter _ cpeMatch "thrift" _
  | cpeVendor (cpeMatchCPE cpeMatch) == Just "facebook" = False
filter _ cpeMatch "kanboard" _
  | cpeTargetSoftware (cpeMatchCPE cpeMatch) == Just "jenkins" = False
filter _cve _match _productID _version = True

anyVersionInfixOf :: Text -> VersionMatcher -> Bool
anyVersionInfixOf t (SingleMatcher v) = t `T.isInfixOf` v
anyVersionInfixOf t (RangeMatcher b1 b2) = hasInfix b1 || hasInfix b2
  where
    hasInfix (Including v) = t `T.isInfixOf` v
    hasInfix (Excluding v) = t `T.isInfixOf` v
    hasInfix Unbounded = False

-- Four digits at the start followed by any number of anything else
yearRegex :: RE' ()
yearRegex =
  void $
    psym isDigit <* psym isDigit <* psym isDigit <* psym isDigit <* many anySym
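Because the trailing `many anySym` matches anything (including more digits), yearRegex accepts exactly the strings whose first four characters are digits. A dependency-free sketch of the same predicate (matchesDateStyle is a hypothetical name, not part of this module):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.Char (isDigit)
import qualified Data.Text as T

-- Equivalent to a whole-string match against yearRegex: four leading digits,
-- then anything.
matchesDateStyle :: T.Text -> Bool
matchesDateStyle v = T.length v >= 4 && T.all isDigit (T.take 4 v)

main :: IO ()
main = do
  print (matchesDateStyle "2009.12.22")  -- True: date-style version
  print (matchesDateStyle "1.2.3")       -- False: ordinary version
```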

cpeUpdatePresentAndNotPartOfVersion :: CPEMatch -> Version -> Bool
cpeUpdatePresentAndNotPartOfVersion cpeMatch v =
  maybe
    False
    (\update -> not (update `T.isInfixOf` v))
    (cpeUpdate (cpeMatchCPE cpeMatch))


================================================
FILE: src/Nix.hs
================================================
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TemplateHaskell #-}

module Nix
  ( assertNewerVersion,
    assertOldVersionOn,
    binPath,
    build,
    getAttr,
    getAttrString,
    getChangelog,
    getDerivationFile,
    getDescription,
    getHash,
    getHashFromBuild,
    getHomepage,
    getMaintainers,
    getPatches,
    getSrcUrl,
    hasPatchNamed,
    hasUpdateScript,
    lookupAttrPath,
    numberOfFetchers,
    numberOfHashes,
    resultLink,
    runUpdateScript,
    fakeHashMatching,
    version,
    Raw (..),
  )
where

import Data.Maybe (fromJust)
import qualified Data.Text as T
import qualified Git
import Language.Haskell.TH.Env (envQ)
import OurPrelude
import System.Exit ()
import qualified System.Process.Typed as TP
import Utils (UpdateEnv (..), nixBuildOptions, nixCommonOptions, srcOrMain)
import Prelude hiding (log)

binPath :: String
binPath = fromJust ($$(envQ "NIX") :: Maybe String) <> "/bin"

data Env = Env [(String, String)]

data Raw
  = Raw
  | NoRaw

data EvalOptions = EvalOptions Raw Env

rawOpt :: Raw -> [String]
rawOpt Raw = ["--raw"]
rawOpt NoRaw = []

nixEvalApply ::
  MonadIO m =>
  Text ->
  Text ->
  ExceptT Text m Text
nixEvalApply applyFunc attrPath =
  ourReadProcess_
    (proc (binPath <> "/nix") (["--extra-experimental-features", "nix-command", "--extra-experimental-features", "flakes", "eval", ".#" <> T.unpack attrPath, "--apply", T.unpack applyFunc]))
    & fmapRT (fst >>> T.strip)

nixEvalApplyRaw ::
  MonadIO m =>
  Text ->
  Text ->
  ExceptT Text m Text
nixEvalApplyRaw applyFunc attrPath =
  ourReadProcess_
    (proc (binPath <> "/nix") (["--extra-experimental-features", "nix-command", "--extra-experimental-features", "flakes", "eval", ".#" <> T.unpack attrPath, "--raw", "--apply", T.unpack applyFunc]))
    & fmapRT (fst >>> T.strip)

nixEvalExpr ::
  MonadIO m =>
  Text ->
  ExceptT Text m Text
nixEvalExpr expr =
  ourReadProcess_
    (proc (binPath <> "/nix") (["--extra-experimental-features", "nix-command", "eval", "--expr", T.unpack expr]))
    & fmapRT (fst >>> T.strip)

-- Error unless the "new version" is actually newer according to nix
assertNewerVersion :: MonadIO m => UpdateEnv -> ExceptT Text m ()
assertNewerVersion updateEnv = do
  versionComparison <-
    nixEvalExpr
      ( "(builtins.compareVersions \""
          <> newVersion updateEnv
          <> "\" \""
          <> oldVersion updateEnv
          <> "\")"
      )
  case versionComparison of
    "1" -> return ()
    a ->
      throwE
        ( newVersion updateEnv
            <> " is not newer than "
            <> oldVersion updateEnv
            <> " according to Nix; versionComparison: "
            <> a
            <> " "
        )

-- This is extremely slow but gives us the best results we know of
lookupAttrPath :: MonadIO m => UpdateEnv -> ExceptT Text m Text
lookupAttrPath updateEnv =
  -- lookup attrpath by nix-env
  ( proc
      (binPath <> "/nix-env")
      ( [ "-qa",
          (packageName updateEnv <> "-" <> oldVersion updateEnv) & T.unpack,
          "-f",
          ".",
          "--attr-path"
        ]
          <> nixCommonOptions
      )
      & ourReadProcess_
      & fmapRT (fst >>> T.lines >>> head >>> T.words >>> head)
  )
    <|>
    -- if that fails, check by attrpath
    (getAttrString "name" (packageName updateEnv))
    & fmapRT (const (packageName updateEnv))

getDerivationFile :: MonadIO m => Text -> ExceptT Text m Text
getDerivationFile attrPath = do
  npDir <- liftIO $ Git.nixpkgsDir
  proc "env" ["EDITOR=echo", (binPath <> "/nix"), "--extra-experimental-features", "nix-command", "edit", attrPath & T.unpack, "-f", "."]
    & ourReadProcess_
    & fmapRT (fst >>> T.strip >>> T.stripPrefix (T.pack npDir <> "/") >>> fromJust)

-- Get an attribute that can be evaluated off a derivation, as in:
-- getAttr "cargoHash" "ripgrep" -> 0lwz661rbm7kwkd6mallxym1pz8ynda5f03ynjfd16vrazy2dj21
getAttr :: MonadIO m => Text -> Text -> ExceptT Text m Text
getAttr attr = srcOrMain (nixEvalApply ("p: p." <> attr))

getAttrString :: MonadIO m => Text -> Text -> ExceptT Text m Text
getAttrString attr = srcOrMain (nixEvalApplyRaw ("p: p." <> attr))

getHash :: MonadIO m => Text -> ExceptT Text m Text
getHash = getAttrString "drvAttrs.outputHash"

getMaintainers :: MonadIO m => Text -> ExceptT Text m Text
getMaintainers =
  nixEvalApplyRaw "p: let gh = m : m.github or \"\"; nonempty = s: s != \"\"; addAt = s: \"@\"+s; in builtins.concatStringsSep \" \" (map addAt (builtins.filter nonempty (map gh p.meta.maintainers or [])))"

readNixBool :: MonadIO m => ExceptT Text m Text -> ExceptT Text m Bool
readNixBool t = do
  text <- t
  case text of
    "true" -> return True
    "false" -> return False
    a -> throwE ("Failed to read expected nix boolean " <> a <> " ")

getChangelog :: MonadIO m => Text -> ExceptT Text m Text
getChangelog = nixEvalApplyRaw "p: p.meta.changelog or \"\""

getDescription :: MonadIO m => Text -> ExceptT Text m Text
getDescription = nixEvalApplyRaw "p: p.meta.description or \"\""

getHomepage :: MonadIO m => Text -> ExceptT Text m Text
getHomepage = nixEvalApplyRaw "p: p.meta.homepage or \"\""

getSrcUrl :: MonadIO m => Text -> ExceptT Text m Text
getSrcUrl =
  srcOrMain
    (nixEvalApplyRaw "p: builtins.elemAt p.drvAttrs.urls 0")

buildCmd :: Text -> ProcessConfig () () ()
buildCmd attrPath =
  silently $ proc (binPath <> "/nix-build") (nixBuildOptions ++ ["-A", attrPath & T.unpack])

log :: Text -> ProcessConfig () () ()
log attrPath = proc (binPath <> "/nix") (["--extra-experimental-features", "nix-command", "log", "-f", ".", attrPath & T.unpack] <> nixCommonOptions)

build :: MonadIO m => Text -> ExceptT Text m ()
build attrPath =
  (buildCmd attrPath & runProcess_ & tryIOTextET)
    <|> ( do
            _ <- buildFailedLog
            throwE "nix log failed trying to get build logs "
        )
  where
    buildFailedLog = do
      buildLog <-
        ourReadProcessInterleaved_ (log attrPath)
          & fmap (T.lines >>> reverse >>> take 30 >>> reverse >>> T.unlines)
      throwE ("nix build failed.\n" <> buildLog <> " ")

-- Count occurrences of common fetcher invocations in the derivation
numberOfFetchers :: Text -> Int
numberOfFetchers derivationContents =
  countUp "fetchurl {" + countUp "fetchgit {" + countUp "fetchFromGitHub {"
  where
    countUp x = T.count x derivationContents

-- Sum the number of things that look like fixed-output derivation hashes
numberOfHashes :: Text -> Int
numberOfHashes derivationContents =
  sum $ map countUp ["sha256 =", "sha256=", "cargoHash =", "vendorHash =", "hash =", "npmDepsHash ="]
  where
    countUp x = T.count x derivationContents
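The counting above can be exercised standalone; T.count tallies non-overlapping occurrences of each pattern, so a derivation with both `hash =` and `cargoHash =` counts twice (countHashes is a hypothetical copy of the logic):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.Text as T

-- Sum occurrences of hash-attribute patterns, mirroring numberOfHashes.
countHashes :: T.Text -> Int
countHashes drv =
  sum (map (`T.count` drv) ["sha256 =", "sha256=", "cargoHash =", "vendorHash =", "hash =", "npmDepsHash ="])

main :: IO ()
main = print (countHashes "{ hash = \"sha256-AAA\"; cargoHash = \"sha256-BBB\"; }")
```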

assertOldVersionOn ::
  MonadIO m => UpdateEnv -> Text -> Text -> ExceptT Text m ()
assertOldVersionOn updateEnv branchName contents =
  tryAssert
    ("Old version " <> oldVersionPattern <> " not present in " <> branchName <> " derivation file with contents: " <> contents)
    (oldVersionPattern `T.isInfixOf` contents)
  where
    oldVersionPattern = oldVersion updateEnv <> "\""

resultLink :: MonadIO m => ExceptT Text m Text
resultLink =
  T.strip
    <$> ( ourReadProcessInterleaved_ "readlink ./result"
            <|> ourReadProcessInterleaved_ "readlink ./result-bin"
            <|> ourReadProcessInterleaved_ "readlink ./result-dev"
            <|> ourReadProcessInterleaved_ "readlink ./result-lib"
        )
    <|> throwE "Could not find result link. "

-- Produce an obviously fake hash with the same SRI prefix as the old hash
fakeHashMatching :: Text -> Text
fakeHashMatching oldHash =
  if "sha512-" `T.isPrefixOf` oldHash
    then "sha512-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=="
    else "sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="

-- fixed-output derivation produced path '/nix/store/fg2hz90z5bc773gpsx4gfxn3l6fl66nw-source' with sha256 hash '0q1lsgc1621czrg49nmabq6am9sgxa9syxrwzlksqqr4dyzw4nmf' instead of the expected hash '0bp22mzkjy48gncj5vm9b7whzrggcbs5pd4cnb6k8jpl9j02dhdv'
getHashFromBuild :: MonadIO m => Text -> ExceptT Text m Text
getHashFromBuild =
  srcOrMain
    ( \attrPath -> do
        (exitCode, _, stderr) <- buildCmd attrPath & readProcess
        when (exitCode == ExitSuccess) $ throwE "build succeeded unexpectedly"
        let stdErrText = bytestringToText stderr
        let firstSplit = T.splitOn "got:    " stdErrText
        firstSplitSecondPart <-
          tryAt
            ("stderr did not split as expected full stderr was: \n" <> stdErrText)
            firstSplit
            1
        let secondSplit = T.splitOn "\n" firstSplitSecondPart
        tryHead
          ( "stderr did not split second part as expected full stderr was: \n"
              <> stdErrText
              <> "\nfirstSplitSecondPart:\n"
              <> firstSplitSecondPart
          )
          secondSplit
    )

version :: MonadIO m => ExceptT Text m Text
version = ourReadProcessInterleaved_ (proc (binPath <> "/nix") ["--version"])

getPatches :: MonadIO m => Text -> ExceptT Text m Text
getPatches =
  nixEvalApply "p: map (patch: patch.name) p.patches"

hasPatchNamed :: MonadIO m => Text -> Text -> ExceptT Text m Bool
hasPatchNamed attrPath name = do
  ps <- getPatches attrPath
  return $ name `T.isInfixOf` ps

hasUpdateScript :: MonadIO m => Text -> ExceptT Text m Bool
hasUpdateScript attrPath = do
  nixEvalApply
    "p: builtins.hasAttr \"updateScript\" p"
    attrPath
    & readNixBool

runUpdateScript :: MonadIO m => Text -> ExceptT Text m (ExitCode, Text)
runUpdateScript attrPath = do
  let timeout = "30m" :: Text
  (exitCode, output) <-
    ourReadProcessInterleaved $
      TP.setStdin (TP.byteStringInput "\n") $
        proc "timeout" [T.unpack timeout, "env", "NIXPKGS_ALLOW_UNFREE=1", "nix-shell", "maintainers/scripts/update.nix", "--argstr", "package", T.unpack attrPath]
  case exitCode of
    ExitFailure 124 -> do
      return (exitCode, "updateScript for " <> attrPath <> " took longer than " <> timeout <> " and timed out. Other output: " <> output)
    _ -> do
      return (exitCode, output)


================================================
FILE: src/NixpkgsReview.hs
================================================
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TemplateHaskell #-}

module NixpkgsReview
  ( cacheDir,
    runReport,
  )
where

import Data.Maybe (fromJust)
import Data.Text as T
import qualified File as F
import Language.Haskell.TH.Env (envQ)
import OurPrelude
import Polysemy.Output (Output, output)
import qualified Process as P
import System.Directory (doesFileExist)
import System.Environment.XDG.BaseDir (getUserCacheDir)
import System.Exit ()
import qualified Utils
import Prelude hiding (log)

binPath :: String
binPath = fromJust ($$(envQ "NIXPKGSREVIEW") :: Maybe String) <> "/bin"

cacheDir :: IO FilePath
cacheDir = getUserCacheDir "nixpkgs-review"

revDir :: FilePath -> Text -> FilePath
revDir cache commit = cache <> "/rev-" <> T.unpack commit

run ::
  Members '[F.File, P.Process, Output Text, Embed IO] r =>
  FilePath ->
  Text ->
  Sem r Text
run cache commit =
  let timeout = "180m" :: Text
   in do
        -- TODO: probably just skip running nixpkgs-review if the directory
        -- already exists
        void $
          ourReadProcessInterleavedSem $
            proc "rm" ["-rf", revDir cache commit]
        (exitCode, _nixpkgsReviewOutput) <-
          ourReadProcessInterleavedSem $
            proc "timeout" [T.unpack timeout, (binPath <> "/nixpkgs-review"), "rev", T.unpack commit, "--no-shell"]
        case exitCode of
          ExitFailure 124 -> do
            output $ "[check][nixpkgs-review] took longer than " <> timeout <> " and timed out"
            return $ ":warning: nixpkgs-review took longer than " <> timeout <> " and timed out"
          _ -> do
            reportExists <- embed $ doesFileExist (revDir cache commit <> "/report.md")
            if reportExists
              then F.read $ (revDir cache commit) <> "/report.md"
              else do
                output $ "[check][nixpkgs-review] report.md does not exist"
                return $ ":x: nixpkgs-review failed"

-- Assumes we are already in the nixpkgs directory
runReport :: (Text -> IO ()) -> Text -> IO Text
runReport log commit = do
  log "[check][nixpkgs-review]"
  c <- cacheDir
  msg <-
    runFinal
      . embedToFinal
      . F.runIO
      . P.runIO
      . Utils.runLog log
      $ NixpkgsReview.run c commit
  log msg
  return msg


================================================
FILE: src/OurPrelude.hs
================================================
{-# LANGUAGE PartialTypeSignatures #-}

module OurPrelude
  ( (>>>),
    (<|>),
    (<>),
    (</>),
    (<&>),
    (&),
    module Control.Error,
    module Control.Monad.Except,
    module Control.Monad.Trans.Class,
    module Control.Monad.IO.Class,
    module Data.Bifunctor,
    module System.Process.Typed,
    module Polysemy,
    module Polysemy.Error,
    ignoreExitCodeException,
    Set,
    Text,
    Vector,
    interpolate,
    tshow,
    tryIOTextET,
    whenM,
    ourReadProcess_,
    ourReadProcess_Sem,
    ourReadProcessInterleaved_,
    ourReadProcessInterleavedBS_,
    ourReadProcessInterleaved,
    ourReadProcessInterleavedSem,
    silently,
    bytestringToText,
  )
where

import Control.Applicative ((<|>))
import Control.Category ((>>>))
import Control.Error
import qualified Control.Exception
import Control.Monad.Except
import Control.Monad.IO.Class
import Control.Monad.Trans.Class
import Data.Bifunctor
import qualified Data.ByteString.Lazy as BSL
import Data.Function ((&))
import Data.Functor ((<&>))
import Data.Set (Set)
import Data.Text (Text, pack)
import qualified Data.Text.Encoding as T
import qualified Data.Text.Encoding.Error as T
import Data.Vector (Vector)
import Language.Haskell.TH.Quote
import qualified NeatInterpolation
import Polysemy
import Polysemy.Error hiding (note, try, tryJust)
import qualified Process as P
import System.Exit
import System.FilePath ((</>))
import System.Process.Typed

interpolate :: QuasiQuoter
interpolate = NeatInterpolation.text

tshow :: Show a => a -> Text
tshow = show >>> pack

tryIOTextET :: MonadIO m => IO a -> ExceptT Text m a
tryIOTextET = syncIO >>> fmapLT tshow

whenM :: Monad m => m Bool -> m () -> m ()
whenM c a = c >>= \res -> when res a

bytestringToText :: BSL.ByteString -> Text
bytestringToText = BSL.toStrict >>> (T.decodeUtf8With T.lenientDecode)

ourReadProcessInterleavedBS_ ::
  MonadIO m =>
  ProcessConfig stdin stdoutIgnored stderrIgnored ->
  ExceptT Text m BSL.ByteString
ourReadProcessInterleavedBS_ = readProcessInterleaved_ >>> tryIOTextET

ourReadProcess_ ::
  MonadIO m =>
  ProcessConfig stdin stdout stderr ->
  ExceptT Text m (Text, Text)
ourReadProcess_ = readProcess_ >>> tryIOTextET >>> fmapRT (\(stdout, stderr) -> (bytestringToText stdout, bytestringToText stderr))

ourReadProcess_Sem ::
  Members '[P.Process] r =>
  ProcessConfig stdin stdoutIgnored stderrIgnored ->
  Sem r (Text, Text)
ourReadProcess_Sem =
  P.read_ >>> fmap (\(stdout, stderr) -> (bytestringToText stdout, bytestringToText stderr))

ourReadProcessInterleaved_ ::
  MonadIO m =>
  ProcessConfig stdin stdoutIgnored stderrIgnored ->
  ExceptT Text m Text
ourReadProcessInterleaved_ =
  readProcessInterleaved_ >>> tryIOTextET >>> fmapRT bytestringToText

ourReadProcessInterleaved ::
  MonadIO m =>
  ProcessConfig stdin stdoutIgnored stderrIgnored ->
  ExceptT Text m (ExitCode, Text)
ourReadProcessInterleaved =
  readProcessInterleaved
    >>> tryIOTextET
    >>> fmapRT (\(a, b) -> (a, bytestringToText b))

ourReadProcessInterleavedSem ::
  Members '[P.Process] r =>
  ProcessConfig stdin stdoutIgnored stderrIgnored ->
  Sem r (ExitCode, Text)
ourReadProcessInterleavedSem =
  P.readInterleaved
    >>> fmap (\(a, b) -> (a, bytestringToText b))

silently :: ProcessConfig stdin stdout stderr -> ProcessConfig () () ()
silently = setStderr closed >>> setStdin closed >>> setStdout closed

ignoreExitCodeException :: IO () -> IO ()
ignoreExitCodeException a = Control.Exception.catch a (\(_e :: ExitCodeException) -> pure ())


================================================
FILE: src/Outpaths.hs
================================================
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE QuasiQuotes #-}

module Outpaths
  ( currentOutpathSet,
    currentOutpathSetUncached,
    ResultLine,
    dummyOutpathSetBefore,
    dummyOutpathSetAfter,
    packageRebuilds,
    numPackageRebuilds,
    outpathReport,
  )
where

import Data.List (sort)
import qualified Data.Set as S
import qualified Data.Text as T
import qualified Data.Text.IO as T
import qualified Data.Vector as V
import qualified Git
import OurPrelude
import qualified System.Directory
import qualified System.Posix.Files as F
import Text.Parsec (parse)
import Text.Parser.Char
import Text.Parser.Combinators
import qualified Utils

outPathsExpr :: Text
outPathsExpr =
  [interpolate|
{ checkMeta
, path ? ./.
}:
let
  lib = import (path + "/lib");
  hydraJobs = import (path + "/pkgs/top-level/release.nix")
    # Compromise: accuracy vs. resources needed for evaluation.
    # We only evaluate one architecture per OS, as we most likely catch all
    # mass-rebuilds this way.
    {
      supportedSystems = [
        "x86_64-linux"
     ];

      nixpkgsArgs = {
        config = {
          allowUnfree = true;
          allowInsecurePredicate = x: true;
          checkMeta = checkMeta;

          handleEvalIssue = reason: errormsg:
            let
              fatalErrors = [
                "unknown-meta" "broken-outputs"
              ];
            in if builtins.elem reason fatalErrors
              then abort errormsg
              else true;

          inHydra = true;
        };
      };
    };
  nixosJobs = import (path + "/nixos/release.nix") {
    supportedSystems = [ "x86_64-linux" ];
  };
  recurseIntoAttrs = attrs: attrs // { recurseForDerivations = true; };

  # hydraJobs leaves recurseForDerivations as empty attrmaps;
  # that would break nix-env and we also need to recurse everywhere.
  tweak = lib.mapAttrs
    (name: val:
      if name == "recurseForDerivations" then true
      else if lib.isAttrs val && val.type or null != "derivation"
              then recurseIntoAttrs (tweak val)
      else val
    );

  # Some of these contain explicit references to platform(s) we want to avoid;
  # some even (transitively) depend on ~/.nixpkgs/config.nix (!)
  blacklist = [
    "tarball" "metrics" "manual"
    "darwin-tested" "unstable" "stdenvBootstrapTools"
    "moduleSystem" "lib-tests" # these just confuse the output
  ];

in
  tweak (
    (builtins.removeAttrs hydraJobs blacklist)
    // {
      nixosTests.simple = nixosJobs.tests.simple;
    }
  )
|]

outPath :: MonadIO m => ExceptT Text m Text
outPath = do
  cacheDir <- liftIO $ Utils.outpathCacheDir
  let outpathFile = (cacheDir </> "outpaths.nix")
  liftIO $ T.writeFile outpathFile outPathsExpr
  liftIO $ putStrLn "[outpaths] eval start"
  currentDir <- liftIO $ System.Directory.getCurrentDirectory
  result <-
    ourReadProcessInterleaved_ $
      proc
        "nix-env"
        [ "-f",
          outpathFile,
          "-qaP",
          "--no-name",
          "--out-path",
          "--arg",
          "path",
          currentDir,
          "--arg",
          "checkMeta",
          "true",
          "--show-trace"
        ]
  liftIO $ putStrLn "[outpaths] eval end"
  pure result

data Outpath = Outpath
  { mayName :: Maybe Text,
    storePath :: Text
  }
  deriving (Eq, Ord, Show)

data ResultLine = ResultLine
  { package :: Text,
    architecture :: Text,
    outpaths :: Vector Outpath
  }
  deriving (Eq, Ord, Show)

-- Example query result line:
-- testInput :: Text
-- testInput =
--   "haskellPackages.amazonka-dynamodb-streams.x86_64-linux                        doc=/nix/store/m4rpsc9nx0qcflh9ni6qdlg6hbkwpicc-amazonka-dynamodb-streams-1.6.0-doc;/nix/store/rvd4zydr22a7j5kgnmg5x6695c7bgqbk-amazonka-dynamodb-streams-1.6.0\nhaskellPackages.agum.x86_64-darwin                                            doc=/nix/store/n526rc0pa5h0krdzsdni5agcpvcd3cb9-agum-2.7-doc;/nix/store/s59r75svbjm724q5iaprq4mln5k6wcr9-agum-2.7"
currentOutpathSet :: MonadIO m => ExceptT Text m (Set ResultLine)
currentOutpathSet = do
  rev <- Git.headRev
  mayOp <- lift $ lookupOutPathByRev rev
  op <- case mayOp of
    Just paths -> pure paths
    Nothing -> do
      paths <- outPath
      dir <- Utils.outpathCacheDir
      let file = dir <> "/" <> T.unpack rev
      liftIO $ T.writeFile file paths
      pure paths
  parse parseResults "outpath" op & fmapL tshow & hoistEither

currentOutpathSetUncached :: MonadIO m => ExceptT Text m (Set ResultLine)
currentOutpathSetUncached = do
  op <- outPath
  parse parseResults "outpath" op & fmapL tshow & hoistEither

lookupOutPathByRev :: MonadIO m => Text -> m (Maybe Text)
lookupOutPathByRev rev = do
  dir <- Utils.outpathCacheDir
  let file = dir <> "/" <> T.unpack rev
  fileExists <- liftIO $ F.fileExist file
  case fileExists of
    False -> return Nothing
    True -> do
      paths <- liftIO $ readFile file
      return $ Just $ T.pack paths

dummyOutpathSetBefore :: Text -> Set ResultLine
dummyOutpathSetBefore attrPath = S.singleton (ResultLine attrPath "x86-64" (V.singleton (Outpath (Just "attrPath") "fakepath")))

dummyOutpathSetAfter :: Text -> Set ResultLine
dummyOutpathSetAfter attrPath = S.singleton (ResultLine attrPath "x86-64" (V.singleton (Outpath (Just "attrPath") "fakepath-edited")))

parseResults :: CharParsing m => m (Set ResultLine)
parseResults = S.fromList <$> parseResultLine `sepEndBy` newline

parseResultLine :: CharParsing m => m ResultLine
parseResultLine =
  ResultLine
    <$> (T.dropWhileEnd (== '.') <$> parseAttrpath)
    <*> parseArchitecture
    <* spaces
    <*> parseOutpaths

parseAttrpath :: CharParsing m => m Text
parseAttrpath = T.concat <$> many (try parseAttrpathPart)

parseAttrpathPart :: CharParsing m => m Text
parseAttrpathPart = T.snoc <$> (T.pack <$> many (noneOf ". ")) <*> char '.'

parseArchitecture :: CharParsing m => m Text
parseArchitecture = T.pack <$> many (noneOf " ")

parseOutpaths :: CharParsing m => m (Vector Outpath)
parseOutpaths = V.fromList <$> (parseOutpath `sepBy1` char ';')

parseOutpath :: CharParsing m => m Outpath
parseOutpath =
  Outpath
    <$> optional (try (T.pack <$> (many (noneOf "=\n") <* char '=')))
    <*> (T.pack <$> many (noneOf ";\n"))

packageRebuilds :: Set ResultLine -> Vector Text
packageRebuilds = S.toList >>> fmap package >>> sort >>> V.fromList >>> V.uniq

numPackageRebuilds :: Set ResultLine -> Int
numPackageRebuilds diff = V.length $ packageRebuilds diff

outpathReport :: Set ResultLine -> Text
outpathReport diff =
  let pkg = tshow $ V.length $ packageRebuilds diff
      firstFifty = T.unlines $ V.toList $ V.take 50 $ packageRebuilds diff
      numPaths = tshow $ S.size diff
   in [interpolate|
        $numPaths total rebuild path(s)

        $pkg package rebuild(s)

        First fifty rebuilds by attrpath
        $firstFifty
      |]


================================================
FILE: src/Process.hs
================================================
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TemplateHaskell #-}

module Process where

import qualified Data.ByteString.Lazy as BSL
import Polysemy
import Polysemy.Input
import System.Exit (ExitCode (..))
import qualified System.Process.Typed as TP

data Process m a where
  Read_ :: TP.ProcessConfig stdin stdout stderr -> Process m (BSL.ByteString, BSL.ByteString)
  ReadInterleaved_ :: TP.ProcessConfig stdin stdout stderr -> Process m BSL.ByteString
  ReadInterleaved :: TP.ProcessConfig stdin stdout stderr -> Process m (ExitCode, BSL.ByteString)

makeSem ''Process

runIO ::
  Member (Embed IO) r =>
  Sem (Process ': r) a ->
  Sem r a
runIO =
  interpret $ \case
    Read_ config -> embed $ (TP.readProcess_ config)
    ReadInterleaved_ config -> embed $ (TP.readProcessInterleaved_ config)
    ReadInterleaved config -> embed $ (TP.readProcessInterleaved config)

runPure ::
  [BSL.ByteString] ->
  Sem (Process ': r) a ->
  Sem r a
runPure outputList =
  runInputList outputList
    . reinterpret \case
      Read_ _config -> do
        r <- maybe "" id <$> input
        return (r, "")
      ReadInterleaved_ _config -> maybe "" id <$> input
      ReadInterleaved _config -> do
        r <- maybe "" id <$> input
        return (ExitSuccess, r)


================================================
FILE: src/Repology.hs
================================================
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE OverloadedStrings #-}

module Repology where

import Control.Applicative (liftA2)
import Control.Concurrent (threadDelay)
import Data.Aeson
import Data.HashMap.Strict
import Data.List
import Data.Proxy
import qualified Data.Text as T
import qualified Data.Text.IO
import qualified Data.Vector as V
import GHC.Generics
import Network.HTTP.Client (managerModifyRequest, newManager, requestHeaders)
import Network.HTTP.Client.TLS (tlsManagerSettings)
import OurPrelude
import Servant.API
import Servant.Client (BaseUrl (..), ClientM, Scheme (..), client, mkClientEnv, runClientM)
import System.IO

baseUrl :: BaseUrl
baseUrl = BaseUrl Https "repology.org" 443 "/api/v1"

rateLimit :: IO ()
rateLimit = threadDelay 2000000

type Project = Vector Package

-- compareProject :: Project -> Project -> Ordering
-- compareProject ps1 ps2 = compareProject' (ps1 V.!? 0) (ps2 V.!? 0)
--   where
--     compareProject' (Just p1) (Just p2) = compare (name p1) (name p2)
--     compareProject' Nothing (Just _) = LT
--     compareProject' (Just _) Nothing = GT
--     compareProject' _ _ = EQ

type Projects = HashMap Text Project

type API =
  "project" :> Capture "project_name" Text :> Get '[JSON] Project
    :<|> "projects" :> QueryParam "inrepo" Text :> QueryParam "outdated" Bool :> Get '[JSON] Projects
    :<|> "projects" :> Capture "name" Text :> QueryParam "inrepo" Text :> QueryParam "outdated" Bool :> Get '[JSON] Projects

data Package = Package
  { repo :: Text,
    srcname :: Maybe Text, -- corresponds to attribute path
    visiblename :: Text, -- corresponds to pname
    version :: Text,
    origversion :: Maybe Text,
    status :: Maybe Text,
    summary :: Maybe Text,
    categories :: Maybe (Vector Text),
    licenses :: Maybe (Vector Text)
  }
  deriving (Eq, Show, Generic, FromJSON)

api :: Proxy API
api = Proxy

project :: Text -> ClientM (Vector Package)
projects ::
  Maybe Text ->
  Maybe Bool ->
  ClientM Projects
projects' ::
  Text ->
  Maybe Text ->
  Maybe Bool ->
  ClientM Projects
project :<|> projects :<|> projects' = client api

-- type PagingResult = PagingResult (Vector Project, ClientM PagingResult)
-- projects :: Text -> ClientM PagingResult
-- projects n = do
--   m <- ms n
--   return (lastProjectName m, sortedProjects m)
lastProjectName :: Projects -> Maybe Text
lastProjectName = keys >>> sort >>> Prelude.reverse >>> headMay

-- sortedProjects :: Projects -> Vector Project
-- sortedProjects = elems >>> sortBy compareProject >>> V.fromList

nixRepo :: Text
nixRepo = "nix_unstable"

nixOutdated :: ClientM Projects
nixOutdated =
  projects
    (Just nixRepo)
    (Just True)

nextNixOutdated :: Text -> ClientM Projects
nextNixOutdated n =
  projects'
    n
    (Just nixRepo)
    (Just True)

outdatedForRepo :: Text -> Vector Package -> Maybe Package
outdatedForRepo r =
  V.find (\p -> (status p) == Just "outdated" && (repo p) == r)

newest :: Vector Package -> Maybe Package
newest = V.find (\p -> (status p) == Just "newest")

getUpdateInfo :: ClientM (Maybe Text, Bool, Vector (Text, (Package, Package)))
getUpdateInfo = do
  liftIO rateLimit
  outdated <- nixOutdated
  let nixNew = toList $ Data.HashMap.Strict.mapMaybe (liftA2 (liftA2 (,)) (outdatedForRepo nixRepo) newest) outdated
  let mLastName = lastProjectName outdated
  liftIO $ hPutStrLn stderr $ show mLastName
  liftIO $ hPutStrLn stderr $ show (size outdated)
  return (mLastName, size outdated /= 1, V.fromList nixNew)

--  let sorted = sortBy (\(p1,_) (p2,_) -> compare (name p1) (name p2)) nixNew
getNextUpdateInfo ::
  Text -> ClientM (Maybe Text, Bool, Vector (Text, (Package, Package)))
getNextUpdateInfo n = do
  liftIO rateLimit
  outdated <- nextNixOutdated n
  let nixNew = toList $ Data.HashMap.Strict.mapMaybe (liftA2 (liftA2 (,)) (outdatedForRepo nixRepo) newest) outdated
  let mLastName = lastProjectName outdated
  liftIO $ hPutStrLn stderr $ show mLastName
  liftIO $ hPutStrLn stderr $ show (size outdated)
  return (mLastName, size outdated /= 1, V.fromList nixNew)

-- Argument should be the Repology identifier of the project, not srcname/attrPath or visiblename/pname.
repologyUrl :: Text -> Text
repologyUrl projectName = "https://repology.org/project/" <> projectName <> "/versions"

--  let sorted = sortBy (\(p1,_) (p2,_) -> compare (name p1) (name p2)) nixNew
updateInfo :: (Text, (Package, Package)) -> Maybe Text
updateInfo (projectName, (outdated, newestP)) = do
  attrPath <- srcname outdated
  pure $ T.unwords [attrPath, version outdated, version newestP, repologyUrl projectName]

justs :: Vector (Maybe a) -> Vector a
justs = V.concatMap (maybeToList >>> V.fromList)

moreNixUpdateInfo ::
  (Maybe Text, Vector (Package, Package)) ->
  ClientM (Vector (Package, Package))
moreNixUpdateInfo (Nothing, acc) = do
  (mLastName, moreWork, newNix) <- getUpdateInfo
  liftIO $
    V.sequence_ $
      fmap Data.Text.IO.putStrLn $
        justs $
          fmap updateInfo newNix
  if moreWork
    then moreNixUpdateInfo (mLastName, fmap snd newNix V.++ acc)
    else return acc
moreNixUpdateInfo (Just n, acc) = do
  (mLastName, moreWork, newNix) <- getNextUpdateInfo n
  liftIO $
    V.sequence_ $
      fmap Data.Text.IO.putStrLn $
        justs $
          fmap updateInfo newNix
  if moreWork
    then moreNixUpdateInfo (mLastName, fmap snd newNix V.++ acc)
    else return acc

allNixUpdateInfo :: ClientM (Vector (Package, Package))
allNixUpdateInfo = moreNixUpdateInfo (Nothing, V.empty)

fetch :: IO ()
fetch = do
  hSetBuffering stdout LineBuffering
  hSetBuffering stderr LineBuffering
  liftIO $ hPutStrLn stderr "starting"
  let addUserAgent req = pure $ req {requestHeaders = ("User-Agent", "https://github.com/nix-community/nixpkgs-update") : requestHeaders req}
  manager' <- newManager tlsManagerSettings {managerModifyRequest = addUserAgent}
  e <- runClientM allNixUpdateInfo (mkClientEnv manager' baseUrl)
  case e of
    Left ce -> liftIO $ hPutStrLn stderr $ show ce
    Right _ -> liftIO $ hPutStrLn stderr $ "done"
  return ()


================================================
FILE: src/Rewrite.hs
================================================
{-# LANGUAGE MultiWayIf #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE ViewPatterns #-}

module Rewrite
  ( Args (..),
    runAll,
    golangModuleVersion,
    rustCrateVersion,
    version,
    redirectedUrls,
  )
where

import qualified Data.Text as T
import Data.Text.Encoding as T
import Data.Text.Encoding.Error as T
import Data.Text.IO as T
import qualified File
import qualified Network.HTTP.Client as HTTP
import Network.HTTP.Types.Status (statusCode)
import qualified Nix
import OurPrelude
import System.Exit ()
import Utils (UpdateEnv (..))
import Prelude hiding (log)

{-
 This module contains rewrite functions that make some modification to the
 nix derivation. These are in the IO monad so that they can do things like
 re-run nix-build to recompute hashes, but morally they should just stick to
 editing the derivationFile for their one stated purpose.

 The return contract is:
 - If it makes a modification, it should return a simple message to attach to
   the pull request description to provide context or justification for code
   reviewers (e.g., a GitHub issue or RFC).
 - If it makes no modification, return Nothing.
 - If it throws an exception, nixpkgs-update will be aborted for the package and
   no other rewrite functions will run.

  TODO: Setup some unit tests for these!
-}
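-- To make the contract above concrete, here is a minimal illustrative
-- rewriter (hypothetical; not one of the rewriters in the plan below).
-- It never modifies the derivation, so it always returns Nothing:
--
--   noopRewriter :: (Text -> IO ()) -> Args -> ExceptT Text IO (Maybe Text)
--   noopRewriter log _args = do
--     lift $ log "nothing to rewrite"
--     return Nothing
--
-- A rewriter that did edit the derivationFile would instead end with
-- something like: return (Just "message for the pull request description")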
data Args = Args
  { updateEnv :: Utils.UpdateEnv,
    attrPath :: Text,
    derivationFile :: FilePath,
    derivationContents :: Text,
    hasUpdateScript :: Bool
  }

type Rewriter = (Text -> IO ()) -> Args -> ExceptT Text IO (Maybe Text)

type Plan = [(Text, Rewriter)]

plan :: Plan
plan =
  [ ("version", version),
    ("rustCrateVersion", rustCrateVersion),
    ("golangModuleVersion", golangModuleVersion),
    ("npmDepsVersion", npmDepsVersion),
    ("updateScript", updateScript)
    -- ("redirectedUrl", Rewrite.redirectedUrls)
  ]

runAll :: (Text -> IO ()) -> Args -> ExceptT Text IO [Text]
runAll log args = do
  msgs <- forM plan $ \(name, f) -> do
    let log' msg =
          if T.null name
            then log msg
            else log $ ("[" <> name <> "] ") <> msg
    lift $ log' "" -- Print initial empty message to signal start of rewriter
    f log' args
  return $ catMaybes msgs

--------------------------------------------------------------------------------
-- The canonical updater: updates the src attribute and recomputes the sha256
version :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)
version log args@Args {..} = do
  if
      | Nix.numberOfFetchers derivationContents > 1 || Nix.numberOfHashes derivationContents > 1 -> do
          lift $ log "generic version rewriter does not support multiple hashes"
          return Nothing
      | hasUpdateScript -> do
          lift $ log "skipping because derivation has updateScript"
          return Nothing
      | otherwise -> do
          srcVersionFix args
          lift $ log "updated version and sha256"
          return $ Just "Version update"

--------------------------------------------------------------------------------
-- Redirect homepage when moved.
redirectedUrls :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)
redirectedUrls log Args {..} = do
  homepage <- Nix.getHomepage attrPath
  response <- liftIO $ do
    manager <- HTTP.newManager HTTP.defaultManagerSettings
    request <- HTTP.parseRequest (T.unpack homepage)
    HTTP.httpLbs request manager
  let status = statusCode $ HTTP.responseStatus response
  if status `elem` [301, 308]
    then do
      lift $ log "Redirecting URL"
      let headers = HTTP.responseHeaders response
          location = lookup "Location" headers
      case location of
        Nothing -> do
          lift $ log "Server did not return a location"
          return Nothing
        Just ((T.decodeUtf8With T.lenientDecode) -> newHomepage) -> do
          _ <- File.replaceIO homepage newHomepage derivationFile
          lift $ log "Replaced homepage"
          return $
            Just $
              "Replaced homepage by "
                <> newHomepage
                <> " due to HTTP "
                <> (T.pack . show) status
    else do
      lift $ log "URL not redirected"
      return Nothing

--------------------------------------------------------------------------------
-- Rewrite Rust on rustPlatform.buildRustPackage
-- This is basically `version` above, but with a second pass to also update the
-- cargoHash.
rustCrateVersion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)
rustCrateVersion log args@Args {..} = do
  if
      | (not (T.isInfixOf "cargoHash" derivationContents)) -> do
          lift $ log "No cargoHash found"
          return Nothing
      | hasUpdateScript -> do
          lift $ log "skipping because derivation has updateScript"
          return Nothing
      | otherwise -> do
          -- This starts the same way `version` does, minus the assert
          srcVersionFix args
          -- But then from there we need to do this a second time for the cargoHash!
          oldCargoHash <- Nix.getAttrString "cargoHash" attrPath
          let fakeHash = Nix.fakeHashMatching oldCargoHash
          _ <- lift $ File.replaceIO oldCargoHash fakeHash derivationFile
          newCargoHash <- Nix.getHashFromBuild attrPath
          when (oldCargoHash == newCargoHash) $ throwE ("cargo hashes equal; no update necessary: " <> oldCargoHash)
          lift . log $ "Replacing cargoHash with " <> newCargoHash
          _ <- lift $ File.replaceIO fakeHash newCargoHash derivationFile
          -- Ensure the package actually builds and passes its tests
          Nix.build attrPath
          lift $ log "Finished updating Crate version and replacing hashes"
          return $ Just "Rust version update"

--------------------------------------------------------------------------------
-- Rewrite Golang packages with buildGoModule
-- This is basically `version` above, but with a second pass to also update the
-- vendorHash (the Go vendor hash).
golangModuleVersion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)
golangModuleVersion log args@Args {..} = do
  if
      | not (T.isInfixOf "buildGo" derivationContents && T.isInfixOf "vendorHash" derivationContents) -> do
          lift $ log "Not a buildGoModule package with vendorHash"
          return Nothing
      | hasUpdateScript -> do
          lift $ log "skipping because derivation has updateScript"
          return Nothing
      | otherwise -> do
          -- This starts the same way `version` does, minus the assert
          srcVersionFix args
          -- But then from there we need to do this a second time for the vendorHash!
          -- Note that explicit `null` cannot be coerced to a string by nix eval --raw
          oldVendorHash <- Nix.getAttr "vendorHash" attrPath
          lift . log $ "Found old vendorHash = " <> oldVendorHash
          original <- liftIO $ T.readFile derivationFile
          _ <- lift $ File.replaceIO oldVendorHash "null" derivationFile
          ok <- runExceptT $ Nix.build attrPath
          _ <-
            if isLeft ok
              then do
                _ <- liftIO $ T.writeFile derivationFile original
                let fakeHash = Nix.fakeHashMatching oldVendorHash
                _ <- lift $ File.replaceIO oldVendorHash ("\"" <> fakeHash <> "\"") derivationFile
                newVendorHash <- Nix.getHashFromBuild attrPath
                _ <- lift $ File.replaceIO fakeHash newVendorHash derivationFile
                -- Note that on some small bumps, this may not actually change if go.sum did not change
                lift . log $ "Replaced vendorHash with " <> newVendorHash
              else do
                lift . log $ "Set vendorHash to null"
          -- Ensure the package actually builds and passes its tests
          Nix.build attrPath
          lift $ log "Finished updating vendorHash"
          return $ Just "Golang update"

--------------------------------------------------------------------------------
-- Rewrite NPM packages with buildNpmPackage
-- This is basically `version` above, but with a second pass to also update the
-- npmDepsHash.
npmDepsVersion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)
npmDepsVersion log args@Args {..} = do
  if
      | not (T.isInfixOf "npmDepsHash" derivationContents) -> do
          lift $ log "No npmDepsHash"
          return Nothing
      | hasUpdateScript -> do
          lift $ log "skipping because derivation has updateScript"
          return Nothing
      | otherwise -> do
          -- This starts the same way `version` does, minus the assert
          srcVersionFix args
          -- But then from there we need to do this a second time for the npmDepsHash!
          oldDepsHash <- Nix.getAttrString "npmDepsHash" attrPath
          let fakeHash = Nix.fakeHashMatching oldDepsHash
          _ <- lift $ File.replaceIO oldDepsHash fakeHash derivationFile
          newDepsHash <- Nix.getHashFromBuild attrPath
          when (oldDepsHash == newDepsHash) $ throwE ("deps hashes equal; no update necessary: " <> oldDepsHash)
          lift . log $ "Replacing npmDepsHash with " <> newDepsHash
          _ <- lift $ File.replaceIO fakeHash newDepsHash derivationFile
          -- Ensure the package actually builds and passes its tests
          Nix.build attrPath
          lift $ log "Finished updating NPM deps version and replacing hashes"
          return $ Just "NPM version update"

--------------------------------------------------------------------------------

-- Calls passthru.updateScript
updateScript :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)
updateScript log Args {..} = do
  if hasUpdateScript
    then do
      (exitCode, msg) <- Nix.runUpdateScript attrPath
      case exitCode of
        ExitSuccess -> do
          lift $ log "Success"
          lift $ log msg
          return $ Just "Ran passthru.updateScript"
        ExitFailure num -> do
          throwE $ "[updateScript] Failed with exit code " <> tshow num <> "\n" <> msg
    else do
      lift $ log "skipping because derivation has no updateScript"
      return Nothing

--------------------------------------------------------------------------------
-- Common helper functions and utilities
-- Helper to update version and src attributes, re-computing the sha256.
-- This is done by the generic version upgrader, but is also a sub-component of some of the others.
srcVersionFix :: MonadIO m => Args -> ExceptT Text m ()
srcVersionFix Args {..} = do
  let UpdateEnv {..} = updateEnv
  oldHash <- Nix.getHash attrPath
  _ <- lift $ File.replaceIO oldVersion newVersion derivationFile
  let fakeHash = Nix.fakeHashMatching oldHash
  _ <- lift $ File.replaceIO oldHash fakeHash derivationFile
  newHash <- Nix.getHashFromBuild attrPath
  when (oldHash == newHash) $ throwE "Hashes equal; no update necessary"
  _ <- lift $ File.replaceIO fakeHash newHash derivationFile
  return ()


================================================
FILE: src/Skiplist.hs
================================================
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE RankNTypes #-}

module Skiplist
  ( packageName,
    content,
    attrPath,
    checkResult,
    python,
    skipOutpathCalc,
  )
where

import Data.Foldable (find)
import qualified Data.Text as T
import OurPrelude

type Skiplist = [(Text -> Bool, Text)]

type TextSkiplister m =
  (MonadError Text m) =>
  Text ->
  m ()

attrPath :: TextSkiplister m
attrPath = skiplister attrPathList

packageName :: TextSkiplister m
packageName name =
  if name == "elementary-xfce-icon-theme" -- https://github.com/nix-community/nixpkgs-update/issues/63
    then return ()
    else skiplister nameList name

content :: TextSkiplister m
content = skiplister contentList

checkResult :: TextSkiplister m
checkResult = skiplister checkResultList

skipOutpathCalc :: TextSkiplister m
skipOutpathCalc = skiplister skipOutpathCalcList

attrPathList :: Skiplist
attrPathList =
  [ prefix "lxqt" "Packages for lxqt are currently skipped.",
    prefix
      "altcoins.bitcoin"
      "@roconnor asked for a skip on this until something can be done with GPG signatures https://github.com/NixOS/nixpkgs/commit/77f3ac7b7638b33ab198330eaabbd6e0a2e751a9",
    eq "sqlite-interactive" "it is an override",
    eq "harfbuzzFull" "it is an override",
    prefix
      "luanti-"
      "luanti-server and luanti-client are different outputs for the luanti package",
    prefix
      "mate."
      "mate packages are upgraded in lockstep https://github.com/NixOS/nixpkgs/pull/50695#issuecomment-441338593",
    prefix
      "deepin"
      "deepin packages are upgraded in lockstep https://github.com/NixOS/nixpkgs/pull/52327#issuecomment-447684194",
    prefix
      "rocmPackages"
      "rocm packages are upgraded in lockstep https://github.com/NixOS/nixpkgs/issues/385294",
    prefix
      "monero-"
      "monero-cli and monero-gui packages are upgraded in lockstep",
    prefix
      "element-desktop"
      "@Ma27 asked to skip",
    prefix
      "element-web"
      "has to be updated along with element-desktop",
    prefix
      "keybinder"
      "it has weird tags. see nixpkgs-update#232",
    infixOf
      "pysc2"
      "crashes nixpkgs-update",
    infixOf
      "tornado"
      "python updateScript updates pinned versions",
    prefix
      "spire-"
      "spire-server and spire-agent are different outputs for spire package",
    eq "imagemagick_light" "same file and version as imagemagick",
    eq "imagemagickBig" "same file and version as imagemagick",
    eq "libheimdal" "alias of heimdal",
    eq "minio_legacy_fs" "@bachp asked to skip",
    eq "flint" "update repeatedly exceeded the 6h timeout",
    eq "keepmenu" "update repeatedly exceeded the 6h timeout",
    eq "klee" "update repeatedly exceeded the 6h timeout",
    eq "dumbpipe" "update repeatedly exceeded the 6h timeout",
    eq "python3Packages.aiosonic" "update repeatedly exceeded the 6h timeout",
    eq "python3Packages.guidata" "update repeatedly exceeded the 6h timeout",
    eq "router" "update repeatedly exceeded the 6h timeout",
    eq "wifite2" "update repeatedly exceeded the 6h timeout",
    eq "granian" "update repeatedly exceeded the 6h timeout",
    eq "python3Packages.granian" "update repeatedly exceeded the 6h timeout",
    eq "vlagent" "updates via victorialogs package",
    eq "vmagent" "updates via victoriametrics package",
    eq "qemu_full" "updates via qemu package",
    eq "qemu_kvm" "updates via qemu package",
    eq "qemu-user" "updates via qemu package",
    eq "qemu-utils" "updates via qemu package",
    eq "ollama-rocm" "only `ollama` is explicitly updated (defined in the same file)",
    eq "ollama-cuda" "only `ollama` is explicitly updated (defined in the same file)",
    eq "python3Packages.mmengine" "takes way too long to build",
    eq "bitwarden-directory-connector-cli" "src is aliased to bitwarden-directory-connector",
    eq "vaultwarden-mysql" "src is aliased to vaultwarden",
    eq "vaultwarden-postgresql" "src is aliased to vaultwarden",
    eq "dune" "same as dune_3",
    eq "kanata-with-cmd" "same src as kanata",
    eq "curlFull" "same as curl",
    eq "curlMinimal" "same as curl",
    eq
      "azure-sdk-for-cpp.curl"
      "same as curl",
    eq
      "yt-dlp-light"
      "updates via yt-dlp"
  ]

nameList :: Skiplist
nameList =
  [ prefix "r-" "we don't know how to find the attrpath for these",
    infixOf "jquery" "this isn't a real package",
    infixOf "google-cloud-sdk" "complicated package",
    infixOf "github-release" "complicated package",
    infixOf "perl" "currently don't know how to update perl",
    infixOf "cdrtools" "We keep downgrading this by accident.",
    infixOf "gst" "gstreamer plugins are kept in lockstep.",
    infixOf "electron" "multi-platform srcs in file.",
    infixOf "xfce" "@volth asked to not update xfce",
    infixOf "cmake-cursesUI-qt4UI" "Derivation file is complicated",
    infixOf "iana-etc" "@mic92 takes care of this package",
    infixOf
      "checkbashism"
      "needs to be fixed, see https://github.com/NixOS/nixpkgs/pull/39552",
    eq "isl" "multi-version long building package",
    infixOf "qscintilla" "https://github.com/nix-community/nixpkgs-update/issues/51",
    eq "itstool" "https://github.com/NixOS/nixpkgs/pull/41339",
    infixOf
      "virtualbox"
      "nixpkgs-update cannot handle updating the guest additions https://github.com/NixOS/nixpkgs/pull/42934",
    eq
      "avr-binutils"
      "https://github.com/NixOS/nixpkgs/pull/43787#issuecomment-408649537",
    eq
      "iasl"
      "two updates had to be reverted, https://github.com/NixOS/nixpkgs/pull/46272",
    eq
      "meson"
      "https://github.com/NixOS/nixpkgs/pull/47024#issuecomment-423300633",
    eq
      "burp"
      "skipped until better versioning schema https://github.com/NixOS/nixpkgs/pull/46298#issuecomment-419536301",
    eq "chromedriver" "complicated package",
    eq
      "gitlab-shell"
      "@globin asked to skip in https://github.com/NixOS/nixpkgs/pull/52294#issuecomment-447653417",
    eq
      "gitlab-workhorse"
      "@globin asked to skip in https://github.com/NixOS/nixpkgs/pull/52286#issuecomment-447653409",
    eq
      "gitlab-elasticsearch-indexer"
      "@yayayayaka asked to skip in https://github.com/NixOS/nixpkgs/pull/244074#issuecomment-1641657015",
    eq "reposurgeon" "takes way too long to build",
    eq "kodelife" "multiple system hashes need to be updated at once",
    eq "openbazaar" "multiple system hashes need to be updated at once",
    eq "stalwart-mail-enterprise" "stalwart-mail-enterprise follows stalwart-mail, which should be updated instead",
    eq "eaglemode" "build hangs or takes way too long",
    eq "autoconf" "@prusnak asked to skip",
    eq "abseil-cpp" "@andersk asked to skip",
    eq "_7zz-rar" "will be updated by _7zz proper",
    eq "ncbi-vdb" "updating this alone breaks sratoolkit",
    eq "sratoolkit" "tied to version of ncbi-vdb",
    eq "libsignal-ffi" "must match the version required by mautrix-signal",
    eq
      "floorp"
      "big package, does not update hashes correctly (https://github.com/NixOS/nixpkgs/pull/424715#issuecomment-3163626684)",
    eq
      "discord-ptb"
      "updates through discord only https://github.com/NixOS/nixpkgs/issues/468956",
    eq
      "discord-canary"
      "updates through discord only https://github.com/NixOS/nixpkgs/issues/468956",
    eq
      "discord-development"
      "updates through discord only https://github.com/NixOS/nixpkgs/issues/468956"
  ]

contentList :: Skiplist
contentList =
  [ infixOf "nixpkgs-update: no auto update" "Derivation file opts out of auto-updates",
    infixOf "DO NOT EDIT" "Derivation file says not to edit it",
    infixOf "Do not edit!" "Derivation file says not to edit it",
    -- Skip packages that have special builders
    infixOf "buildRustCrate" "Derivation contains buildRustCrate",
    infixOf "buildRubyGem" "Derivation contains buildRubyGem",
    infixOf "bundlerEnv" "Derivation contains bundlerEnv",
    infixOf "buildPerlPackage" "Derivation contains buildPerlPackage",
    -- Specific skips for classes of packages
    infixOf "teams.gnome" "Do not update GNOME during a release cycle",
    infixOf "https://downloads.haskell.org/ghc/" "GHC packages are versioned per file"
  ]

checkResultList :: Skiplist
checkResultList =
  [ infixOf
      "busybox"
      "- busybox result is not automatically checked, because some binaries kill the shell",
    infixOf
      "gjs"
      "- gjs result is not automatically checked, because some tests take a long time to run",
    infixOf
      "casperjs"
      "- casperjs result is not automatically checked, because some tests take a long time to run",
    binariesStickAround "kicad",
    binariesStickAround "fcitx",
    binariesStickAround "x2goclient",
    binariesStickAround "gpg-agent",
    binariesStickAround "dirmngr",
    binariesStickAround "barrier",
    binariesStickAround "fail2ban",
    binariesStickAround "zed",
    binariesStickAround "haveged"
  ]

skipOutpathCalcList :: Skiplist
skipOutpathCalcList =
  [ eq "firefox-beta-bin-unwrapped" "master",
    eq "firefox-devedition-bin-unwrapped" "master",
    -- "firefox-release-bin-unwrapped" is unneeded here because firefox-bin is a dependency of other packages that Hydra doesn't ignore.
    prefix "linuxKernel.kernels" "master",
    eq "bmake" "staging" -- mass rebuild only on darwin
  ]

binariesStickAround :: Text -> (Text -> Bool, Text)
binariesStickAround name =
  infixOf name ("- " <> name <> " result is not automatically checked because some binaries stick around")

skiplister :: Skiplist -> TextSkiplister m
skiplister skiplist input = forM_ result throwError
  where
    result = snd <$> find (\(isSkiplisted, _) -> isSkiplisted input) skiplist

prefix :: Text -> Text -> (Text -> Bool, Text)
prefix part reason = ((part `T.isPrefixOf`), reason)

infixOf :: Text -> Text -> (Text -> Bool, Text)
infixOf part reason = ((part `T.isInfixOf`), reason)

eq :: Text -> Text -> (Text -> Bool, Text)
eq part reason = ((part ==), reason)

python :: Monad m => Int -> Text -> ExceptT Text m ()
python numPackageRebuilds derivationContents =
  tryAssert
    ( "Python package with too many package rebuilds "
        <> tshow numPackageRebuilds
        <> " > "
        <> tshow maxPackageRebuild
    )
    (not isPython || numPackageRebuilds <= maxPackageRebuild)
  where
    isPython = "buildPythonPackage" `T.isInfixOf` derivationContents
    maxPackageRebuild = 100
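
A `Skiplist` is just a list of (predicate, reason) pairs, and `skiplister` throws the reason of the first predicate that matches. The same lookup can be stated standalone as a pure function (`skipReason` and `demoList` are illustrative names, not part of this module):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.Foldable (find)
import Data.Text (Text)
import qualified Data.Text as T

-- Pure core of skiplister: the reason for the first matching
-- predicate, or Nothing when the input is not skiplisted.
skipReason :: [(Text -> Bool, Text)] -> Text -> Maybe Text
skipReason skiplist input =
  snd <$> find (\(matches, _) -> matches input) skiplist

demoList :: [(Text -> Bool, Text)]
demoList =
  [ (T.isPrefixOf "lxqt", "Packages for lxqt are currently skipped."),
    ((== "dune"), "same as dune_3")
  ]

-- skipReason demoList "lxqt-panel" == Just "Packages for lxqt are currently skipped."
-- skipReason demoList "hello"      == Nothing
```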


================================================
FILE: src/Update.hs
================================================
{-# LANGUAGE ExtendedDefaultRules #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE QuasiQuotes #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TemplateHaskell #-}
{-# OPTIONS_GHC -fno-warn-type-defaults #-}

module Update
  ( addPatched,
    assertNotUpdatedOn,
    cveAll,
    cveReport,
    prMessage,
    sourceGithubAll,
    updatePackage,
  )
where

import CVE (CVE, cveID, cveLI)
import qualified Check
import Control.Exception (bracket)
import Control.Monad.Writer (execWriterT, tell)
import Data.Maybe (fromJust)
import Data.Monoid (Alt (..))
import qualified Data.Set as S
import qualified Data.Text as T
import qualified Data.Text.IO as T
import Data.Time.Calendar (showGregorian)
import Data.Time.Clock (getCurrentTime, utctDay)
import qualified Data.Vector as V
import qualified GH
import qualified Git
import NVD (getCVEs, withVulnDB)
import qualified Nix
import qualified NixpkgsReview
import OurPrelude
import qualified Outpaths
import qualified Rewrite
import qualified Skiplist
import System.Directory (doesDirectoryExist, withCurrentDirectory)
import System.Posix.Directory (createDirectory)
import Utils
  ( Boundary (..),
    Options (..),
    UpdateEnv (..),
    VersionMatcher (..),
    branchName,
    logDir,
    parseUpdates,
    prTitle,
    whenBatch,
  )
import qualified Utils as U
import qualified Version
import Prelude hiding (log)

default (T.Text)

alsoLogToAttrPath :: Text -> (Text -> IO ()) -> Text -> IO (Text -> IO ())
alsoLogToAttrPath attrPath topLevelLog url = do
  logFile <- attrPathLogFilePath attrPath
  T.appendFile logFile $ "Running nixpkgs-update (" <> url <> ") with UPDATE_INFO: "
  let attrPathLog = log' logFile
  return \text -> do
    topLevelLog text
    attrPathLog text

log' :: MonadIO m => FilePath -> Text -> m ()
log' logFile msg = liftIO $ T.appendFile logFile (msg <> "\n")

attrPathLogFilePath :: Text -> IO String
attrPathLogFilePath attrPath = do
  lDir <- logDir
  now <- getCurrentTime
  let dir = lDir <> "/" <> T.unpack attrPath
  dirExists <- doesDirectoryExist dir
  unless
    dirExists
    (createDirectory dir U.regDirMode)
  let logFile = dir <> "/" <> showGregorian (utctDay now) <> ".log"
  putStrLn ("For attrpath " <> T.unpack attrPath <> ", using log file: " <> logFile)
  return logFile

notifyOptions :: (Text -> IO ()) -> Options -> IO ()
notifyOptions log o = do
  let repr f = if f o then "YES" else "NO"
  let ghUser = GH.untagName . githubUser $ o
  let pr = repr doPR
  let batch = repr batchUpdate
  let outpaths = repr calculateOutpaths
  let cve = repr makeCVEReport
  let review = repr runNixpkgsReview
  let exactAttrPath = repr U.attrpath
  npDir <- tshow <$> Git.nixpkgsDir
  log $
    [interpolate| [options] github_user: $ghUser, pull_request: $pr, batch_update: $batch, calculate_outpaths: $outpaths, cve_report: $cve, nixpkgs-review: $review, nixpkgs_dir: $npDir, use attrpath: $exactAttrPath|]

cveAll :: Options -> Text -> IO ()
cveAll o updates = do
  let u' = rights $ parseUpdates updates
  results <-
    mapM
      ( \(p, oldV, newV, url) -> do
          r <- cveReport (UpdateEnv p oldV newV url o)
          return $ p <> ": " <> oldV <> " -> " <> newV <> "\n" <> r
      )
      u'
  T.putStrLn (T.unlines results)

sourceGithubAll :: Options -> Text -> IO ()
sourceGithubAll o updates = do
  let u' = rights $ parseUpdates updates
  _ <-
    runExceptT $ do
      Git.fetchIfStale <|> liftIO (T.putStrLn "Failed to fetch.")
      Git.cleanAndResetTo "master"
  mapM_
    ( \(p, oldV, newV, url) -> do
        let updateEnv = UpdateEnv p oldV newV url o
        runExceptT $ do
          attrPath <- Nix.lookupAttrPath updateEnv
          srcUrl <- Nix.getSrcUrl attrPath
          v <- GH.latestVersion updateEnv srcUrl
          when (v /= newV) $
            liftIO $
              T.putStrLn $
                p <> ": " <> oldV <> " -> " <> newV <> " -> " <> v
    )
    u'

data UpdatePackageResult = UpdatePackageSuccess | UpdatePackageFailure

-- Arguments this function should have to make it testable:
-- - the merge base commit (should be updated externally to this function)
-- - the commit for branches: master, staging, staging-next, staging-nixos
updatePackageBatch ::
  (Text -> IO ()) ->
  Text ->
  UpdateEnv ->
  IO UpdatePackageResult
updatePackageBatch simpleLog updateInfoLine updateEnv@UpdateEnv {..} = do
  eitherFailureOrAttrpath <- runExceptT $ do
    -- Filters that don't need git
    whenBatch updateEnv do
      Skiplist.packageName packageName
      -- Update our git checkout
      Git.fetchIfStale <|> liftIO (T.putStrLn "Failed to fetch.")

    -- Filters: various cases where we shouldn't update the package
    if attrpath options
      then return packageName
      else Nix.lookupAttrPath updateEnv
  let url =
        if isBot updateEnv
          then "https://nix-community.org/update-bot/"
          else "https://github.com/nix-community/nixpkgs-update"
  case eitherFailureOrAttrpath of
    Left failure -> do
      simpleLog failure
      return UpdatePackageFailure
    Right foundAttrPath -> do
      log <- alsoLogToAttrPath foundAttrPath simpleLog url
      log updateInfoLine
      mergeBase <-
        if batchUpdate options
          then Git.mergeBase
          else pure "HEAD"
      withWorktree mergeBase foundAttrPath updateEnv $
        updateAttrPath log mergeBase updateEnv foundAttrPath

checkExistingUpdate ::
  (Text -> IO ()) ->
  UpdateEnv ->
  Maybe Text ->
  Text ->
  ExceptT Text IO ()
checkExistingUpdate log updateEnv existingCommitMsg attrPath = do
  case existingCommitMsg of
    Nothing -> lift $ log "No auto update branch exists"
    Just msg -> do
      let nV = newVersion updateEnv
      lift $
        log
          [interpolate|An auto update branch exists with message `$msg`. New version is $nV.|]

      case U.titleVersion msg of
        Just branchV
          | Version.matchVersion (RangeMatcher (Including nV) Unbounded) branchV ->
              throwError "An auto update branch exists with an equal or greater version"
        _ ->
          lift $ log "The auto update branch does not match or exceed the new version."

  -- Note that this check looks for PRs with the same old and new
  -- version numbers, so it won't stop us from updating an existing PR
  -- if this run updates the package to a newer version.
  GH.checkExistingUpdatePR updateEnv attrPath

updateAttrPath ::
  (Text -> IO ()) ->
  Text ->
  UpdateEnv ->
  Text ->
  IO UpdatePackageResult
updateAttrPath log mergeBase updateEnv@UpdateEnv {..} attrPath = do
  log $ "attrpath: " <> attrPath
  let pr = doPR options

  successOrFailure <- runExceptT $ do
    hasUpdateScript <- Nix.hasUpdateScript attrPath

    existingCommitMsg <- fmap getAlt . execWriterT $
      whenBatch updateEnv do
        Skiplist.attrPath attrPath
        when pr do
          liftIO $ log "Checking auto update branch..."
          mbLastCommitMsg <- lift $ Git.findAutoUpdateBranchMessage packageName
          tell $ Alt mbLastCommitMsg
          unless hasUpdateScript do
            lift $ checkExistingUpdate log updateEnv mbLastCommitMsg attrPath

    unless hasUpdateScript do
      Nix.assertNewerVersion updateEnv
      Version.assertCompatibleWithPathPin updateEnv attrPath

    let skipOutpathBase = either Just (const Nothing) $ Skiplist.skipOutpathCalc packageName

    derivationFile <- Nix.getDerivationFile attrPath
    unless hasUpdateScript do
      assertNotUpdatedOn updateEnv derivationFile "master"
      assertNotUpdatedOn updateEnv derivationFile "staging"
      assertNotUpdatedOn updateEnv derivationFile "staging-next"
      assertNotUpdatedOn updateEnv derivationFile "staging-nixos"

    -- Calculate output paths for rebuilds and our merge base
    let calcOutpaths = calculateOutpaths options && isNothing skipOutpathBase
    mergeBaseOutpathSet <-
      if calcOutpaths
        then Outpaths.currentOutpathSet
        else return $ Outpaths.dummyOutpathSetBefore attrPath

    -- Get the original values for diffing purposes
    derivationContents <- liftIO $ T.readFile $ T.unpack derivationFile
    oldHash <- Nix.getHash attrPath <|> pure ""
    oldSrcUrl <- Nix.getSrcUrl attrPath <|> pure ""
    oldRev <- Nix.getAttrString "rev" attrPath <|> pure ""
    oldVerMay <- rightMay `fmapRT` (lift $ runExceptT $ Nix.getAttrString "version" attrPath)

    tryAssert
      "The derivation has no 'version' attribute, so we do not know how to determine the version during an updateScript update"
      (not hasUpdateScript || isJust oldVerMay)

    -- One final filter
    Skiplist.content derivationContents

    ----------------------------------------------------------------------------
    -- UPDATES
    --
    -- At this point, we've stashed the old derivation contents and
    -- validated that we actually should be rewriting something. Get
    -- to work processing the various rewrite functions!
    rewriteMsgs <- Rewrite.runAll log Rewrite.Args {derivationFile = T.unpack derivationFile, ..}
    ----------------------------------------------------------------------------

    -- Compute the diff and get updated values
    diffAfterRewrites <- Git.diff mergeBase
    tryAssert
      "The diff was empty after rewrites."
      (diffAfterRewrites /= T.empty)
    lift . log $ "Diff after rewrites:\n" <> diffAfterRewrites
    updatedDerivationContents <- liftIO $ T.readFile $ T.unpack derivationFile
    newSrcUrl <- Nix.getSrcUrl attrPath <|> pure ""
    newHash <- Nix.getHash attrPath <|> pure ""
    newRev <- Nix.getAttrString "rev" attrPath <|> pure ""
    newVerMay <- rightMay `fmapRT` (lift $ runExceptT $ Nix.getAttrString "version" attrPath)

    tryAssert
      "The derivation has no 'version' attribute, so we do not know how to determine the version during an updateScript update"
      (not hasUpdateScript || isJust newVerMay)

    -- Sanity checks to make sure the PR is worth opening
    unless hasUpdateScript do
      when (derivationContents == updatedDerivationContents) $ throwE "No rewrites performed on derivation."
      when (oldSrcUrl /= "" && oldSrcUrl == newSrcUrl) $ throwE "Source url did not change; no update necessary"
      when (oldHash /= "" && oldHash == newHash) $ throwE "Hashes equal; no update necessary"
      when (oldRev /= "" && oldRev == newRev) $ throwE "rev equal; no update necessary"

    --
    -- Update updateEnv if using updateScript
    updateEnv' <-
      if hasUpdateScript
        then do
          -- Already checked that these are Just above.
          let oldVer = fromJust oldVerMay
          let newVer = fromJust newVerMay

          -- Some update scripts make file changes but don't update the package
          -- version; ignore these updates (#388)
          when (newVer == oldVer) $ throwE "Package version did not change."

          return $
            UpdateEnv
              packageName
              oldVer
              newVer
              (Just "passthru.updateScript")
              options
        else return updateEnv

    whenBatch updateEnv do
      when pr do
        when hasUpdateScript do
          checkExistingUpdate log updateEnv' existingCommitMsg attrPath

    when hasUpdateScript do
      changedFiles <- Git.diffFileNames mergeBase
      let rewrittenFile = case changedFiles of [f] -> f; _ -> derivationFile
      assertNotUpdatedOn updateEnv' rewrittenFile "master"
      assertNotUpdatedOn updateEnv' rewrittenFile "staging"
      assertNotUpdatedOn updateEnv' rewrittenFile "staging-next"
      assertNotUpdatedOn updateEnv' rewrittenFile "staging-nixos"

    --
    -- Outpaths
    -- this section is very slow
    editedOutpathSet <-
      if calcOutpaths
        then Outpaths.currentOutpathSetUncached
        else return $ Outpaths.dummyOutpathSetAfter attrPath
    let opDiff = S.difference mergeBaseOutpathSet editedOutpathSet
    let numPRebuilds = Outpaths.numPackageRebuilds opDiff
    whenBatch updateEnv do
      Skiplist.python numPRebuilds derivationContents
    when (numPRebuilds == 0) (throwE "Update edits cause no rebuilds.")
    -- end outpaths section

    Nix.build attrPath

    --
    -- Publish the result
    lift . log $ "Successfully finished processing"
    result <- Nix.resultLink
    let opReport =
          if isJust skipOutpathBase
            then "Outpath calculations were skipped for this package; total number of rebuilds unknown."
            else Outpaths.outpathReport opDiff
    let prBase =
          flip
            fromMaybe
            skipOutpathBase
            if Outpaths.numPackageRebuilds opDiff <= 500
              then
                if any (T.isInfixOf "nixosTests.simple") (V.toList $ Outpaths.packageRebuilds opDiff)
                  then "staging-nixos"
                  else "master"
              else "staging"
    publishPackage log updateEnv' oldSrcUrl newSrcUrl attrPath result opReport prBase rewriteMsgs (isJust existingCommitMsg)

  case successOrFailure of
    Left failure -> do
      log failure
      return UpdatePackageFailure
    Right () -> return UpdatePackageSuccess

publishPackage ::
  (Text -> IO ()) ->
  UpdateEnv ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text ->
  [Text] ->
  Bool ->
  ExceptT Text IO ()
publishPackage log updateEnv oldSrcUrl newSrcUrl attrPath result opReport prBase rewriteMsgs branchExists = do
  cacheTestInstructions <- doCache log updateEnv result
  resultCheckReport <-
    case Skiplist.checkResult (packageName updateEnv) of
      Right () -> lift $ Check.result updateEnv (T.unpack result)
      Left msg -> pure msg
  metaDescription <- Nix.getDescription attrPath <|> return T.empty
  metaHomepage <- Nix.getHomepage attrPath <|> return T.empty
  metaChangelog <- Nix.getChangelog attrPath <|> return T.empty
  cveRep <- liftIO $ cveReport updateEnv
  releaseUrl <- GH.releaseUrl updateEnv newSrcUrl <|> return ""
  compareUrl <- GH.compareUrl oldSrcUrl newSrcUrl <|> return ""
  maintainers <- Nix.getMaintainers attrPath
  let commitMsg = commitMessage updateEnv attrPath
  Git.commit commitMsg
  commitRev <- Git.headRev
  nixpkgsReviewMsg <-
    if prBase /= "staging" && (runNixpkgsReview . options $ updateEnv)
      then liftIO $ NixpkgsReview.runReport log commitRev
      else return ""
  -- Try to push it three times
  -- (these pushes use --force, so it doesn't matter if branchExists is True)
  when
    (doPR . options $ updateEnv)
    (Git.push updateEnv <|> Git.push updateEnv <|> Git.push updateEnv)
  let prMsg =
        prMessage
          updateEnv
          metaDescription
          metaHomepage
          metaChangelog
          rewriteMsgs
          releaseUrl
          compareUrl
          resultCheckReport
          commitRev
          attrPath
          maintainers
          result
          opReport
          cveRep
          cacheTestInstructions
          nixpkgsReviewMsg
  liftIO $ log prMsg
  if (doPR . options $ updateEnv)
    then do
      let ghUser = GH.untagName . githubUser . options $ updateEnv
      let mkPR = if branchExists then GH.prUpdate else GH.pr
      (reusedPR, pullRequestUrl) <- mkPR updateEnv (prTitle updateEnv attrPath) prMsg (ghUser <> ":" <> (branchName updateEnv)) prBase
      when branchExists $
        liftIO $
          log
            if reusedPR
              then "Updated existing PR"
              else "Reused existing auto update branch, but no corresponding open PR was found, so created a new PR"
      liftIO $ log pullRequestUrl
    else liftIO $ T.putStrLn prMsg

commitMessage :: UpdateEnv -> Text -> Text
commitMessage updateEnv attrPath = prTitle updateEnv attrPath

prMessage ::
  UpdateEnv ->
  Text ->
  Text ->
  Text ->
  [Text] ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text ->
  Text
prMessage updateEnv metaDescription metaHomepage metaChangelog rewriteMsgs releaseUrl compareUrl resultCheckReport commitRev attrPath maintainers resultPath opReport cveRep cacheTestInstructions nixpkgsReviewMsg =
  -- Some components of the PR description are generated ahead of time by the
  -- caller because they require IO, but in general we try to do as much of
  -- the formatting as possible in this pure function so that we can control
  -- the body formatting in one place and unit test it.
  let metaHomepageLine =
        if metaHomepage == T.empty
          then ""
          else "meta.homepage for " <> attrPath <> " is: " <> metaHomepage
      metaDescriptionLine =
        if metaDescription == T.empty
          then ""
          else "meta.description for " <> attrPath <> " is: " <> metaDescription
      metaChangelogLine =
        if metaChangelog == T.empty
          then ""
          else "meta.changelog for " <> attrPath <> " is: " <> metaChangelog
      rewriteMsgsLine = foldl (\ms m -> ms <> T.pack "\n- " <> m) "\n###### Updates performed" rewriteMsgs
      maintainersCc =
        if not (T.null maintainers)
          then "cc " <> maintainers <> " for [testing](https://github.com/nix-community/nixpkgs-update/blob/main/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do)."
          else ""
      releaseUrlMessage =
        if releaseUrl == T.empty
          then ""
          else "- [Release on GitHub](" <> releaseUrl <> ")"
      compareUrlMessage =
        if compareUrl == T.empty
          then ""
          else "- [Compare changes on GitHub](" <> compareUrl <> ")"
      nixpkgsReviewSection =
        if nixpkgsReviewMsg == T.empty
          then "Nixpkgs review skipped"
          else
            [interpolate|
            We have automatically built all packages that will get rebuilt due to
            this change.

            This gives evidence on whether the upgrade will break dependent packages.
            Note that packages sometimes show up as _failed to build_ independent of
            this change, simply because they are already broken on the target branch.

            $nixpkgsReviewMsg
            |]
      pat link = [interpolate|This update was made based on information from $link.|]
      sourceLinkInfo = maybe "" pat $ sourceURL updateEnv
      ghUser = GH.untagName . githubUser . options $ updateEnv
      batch = batchUpdate . options $ updateEnv
      automatic = if batch then "Automatic" else "Semi-automatic"
   in [interpolate|
       $automatic update generated by [nixpkgs-update](https://github.com/nix-community/nixpkgs-update) tools. $sourceLinkInfo

       $metaDescriptionLine

       $metaHomepageLine

       $metaChangelogLine

       $rewriteMsgsLine

       ###### To inspect upstream changes

       $releaseUrlMessage

       $compareUrlMessage

       ###### Impact

       <b>Checks done</b>

       ---

       - built on NixOS
       $resultCheckReport

       ---

       <details>
       <summary>
       <b>Rebuild report</b> (if merged into master) (click to expand)
       </summary>

       ```
       $opReport
       ```

       </details>

       <details>
       <summary>
       <b>Instructions to test this update</b> (click to expand)
       </summary>

       ---

       $cacheTestInstructions
       ```
       nix-build -A $attrPath https://github.com/$ghUser/nixpkgs/archive/$commitRev.tar.gz
       ```
       Or:
       ```
       nix build github:$ghUser/nixpkgs/$commitRev#$attrPath
       ```

       After you've downloaded or built it, look at the files and if there are any, run the binaries:
       ```
       ls -la $resultPath
       ls -la $resultPath/bin
       ```

       ---

       </details>
       <br/>

       $cveRep

       ### Pre-merge build results

       $nixpkgsReviewSection

       ---

       ###### Maintainer pings

       $maintainersCc

       > [!TIP]
       > As a maintainer, if your package is located under `pkgs/by-name/*`, you can comment **`@NixOS/nixpkgs-merge-bot merge`** to automatically merge this update using the [`nixpkgs-merge-bot`](https://github.com/NixOS/nixpkgs/blob/master/ci/README.md#nixpkgs-merge-bot).
    |]

assertNotUpdatedOn ::
  MonadIO m => UpdateEnv -> Text -> Text -> ExceptT Text m ()
assertNotUpdatedOn updateEnv derivationFile branch = do
  derivationContents <- Git.show branch derivationFile
  Nix.assertOldVersionOn updateEnv branch derivationContents

addPatched :: Text -> Set CVE -> IO [(CVE, Bool)]
addPatched attrPath set = do
  let list = S.toList set
  forM
    list
    ( \cve -> do
        patched <- runExceptT $ Nix.hasPatchNamed attrPath (cveID cve)
        let p =
              case patched of
                Left _ -> False
                Right r -> r
        return (cve, p)
    )

cveReport :: UpdateEnv -> IO Text
cveReport updateEnv =
  if not (makeCVEReport . options $ updateEnv)
    then return ""
    else withVulnDB $ \conn -> do
      let pname1 = packageName updateEnv
      let pname2 = T.replace "-" "_" pname1
      oldCVEs1 <- getCVEs conn pname1 (oldVersion updateEnv)
      oldCVEs2 <- getCVEs conn pname2 (oldVersion updateEnv)
      let oldCVEs = S.fromList (oldCVEs1 ++ oldCVEs2)
      newCVEs1 <- getCVEs conn pname1 (newVersion updateEnv)
      newCVEs2 <- getCVEs conn pname2 (newVersion updateEnv)
      let newCVEs = S.fromList (newCVEs1 ++ newCVEs2)
      let inOldButNotNew = S.difference oldCVEs newCVEs
          inNewButNotOld = S.difference newCVEs oldCVEs
          inBoth = S.intersection oldCVEs newCVEs
          ifEmptyNone t =
            if t == T.empty
              then "none"
              else t
      inOldButNotNew' <- addPatched (packageName updateEnv) inOldButNotNew
      inNewButNotOld' <- addPatched (packageName updateEnv) inNewButNotOld
      inBoth' <- addPatched (packageName updateEnv) inBoth
      let toMkdownList = fmap (uncurry cveLI) >>> T.unlines >>> ifEmptyNone
          fixedList = toMkdownList inOldButNotNew'
          newList = toMkdownList inNewButNotOld'
          unresolvedList = toMkdownList inBoth'
      if fixedList == "none" && unresolvedList == "none" && newList == "none"
        then return ""
        else
          return
            [interpolate|
      ###### Security vulnerability report

      <details>
      <summary>
      Security report (click to expand)
      </summary>

      CVEs resolved by this update:
      $fixedList

      CVEs introduced by this update:
      $newList

      CVEs present in both versions:
      $unresolvedList


       </details>
       <br/>
      |]

isBot :: UpdateEnv -> Bool
isBot updateEnv =
  let o = options updateEnv
   in batchUpdate o && "r-ryantm" == (GH.untagName $ githubUser o)

doCache :: MonadIO m => (Text -> m ()) -> UpdateEnv -> Text -> ExceptT Text m Text
doCache log updateEnv resultPath =
  if isBot updateEnv
    then do
      return
        [interpolate|
       Either **download from the cache**:
       ```
       nix-store -r $resultPath \
         --option binary-caches 'https://cache.nixos.org/ https://nixpkgs-update-cache.nix-community.org/' \
         --option trusted-public-keys '
         nixpkgs-update-cache.nix-community.org-1:U8d6wiQecHUPJFSqHN9GSSmNkmdiFW7GW7WNAnHW0SM=
         cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY=
         '
       ```
       (The nixpkgs-update cache is only trusted for this store-path realization.)
       For the cached download to work, your user must be in the `trusted-users` list or you can use `sudo` since root is effectively trusted.

       Or, **build yourself**:
       |]
    else do
      lift $ log "skipping cache"
      return "Build yourself:"

updatePackage ::
  Options ->
  Text ->
  IO ()
updatePackage o updateInfo = do
  let (p, oldV, newV, url) = head (rights (parseUpdates updateInfo))
  let updateInfoLine = (p <> " " <> oldV <> " -> " <> newV <> fromMaybe "" (fmap (" " <>) url))
  let updateEnv = UpdateEnv p oldV newV url o
  let log = T.putStrLn
  liftIO $ notifyOptions log o
  updated <- updatePackageBatch log updateInfoLine updateEnv
  case updated of
    UpdatePackageFailure -> do
      log $ "[result] Failed to update " <> updateInfoLine
    UpdatePackageSuccess -> do
      log $ "[result] Success updating " <> updateInfoLine

withWorktree :: Text -> Text -> UpdateEnv -> IO a -> IO a
withWorktree branch attrpath updateEnv action = do
  bracket
    ( do
        dir <- U.worktreeDir
        let path = dir <> "/" <> T.unpack (T.replace ".lock" "_lock" attrpath)
        Git.worktreeRemove path
        Git.delete1 (branchName updateEnv)
        Git.worktreeAdd path branch updateEnv
        pure path
    )
    ( \path -> do
        Git.worktreeRemove path
        Git.delete1 (branchName updateEnv)
    )
    (\path -> withCurrentDirectory path action)

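The `withWorktree` helper above follows the classic bracket pattern: acquire a git worktree, run the action inside it, and remove the worktree again even if the action throws. A minimal standalone sketch of that acquire/use/release shape (the path and messages here are illustrative only, not from the repo):

```haskell
import Control.Exception (bracket)
import Data.IORef (modifyIORef, newIORef, readIORef)

-- Sketch of the shape used by withWorktree: bracket guarantees the
-- release step runs even when the middle action fails.
demo :: IO [String]
demo = do
  logRef <- newIORef []
  let note s = modifyIORef logRef (++ [s])
  _ <-
    bracket
      (note "worktree add" >> pure "/tmp/demo-worktree") -- acquire
      (\_path -> note "worktree remove")                 -- release, always runs
      (\path -> note ("run update in " ++ path))         -- use
  readIORef logRef

main :: IO ()
main = demo >>= mapM_ putStrLn
```

Running this prints the three steps in acquire, use, release order.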

================================================
FILE: src/Utils.hs
================================================
{-# LANGUAGE ExtendedDefaultRules #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE QuasiQuotes #-}
{-# OPTIONS_GHC -fno-warn-type-defaults #-}

module Utils
  ( Boundary (..),
    Options (..),
    ProductID,
    URL,
    UpdateEnv (..),
    Version,
    VersionMatcher (..),
    branchName,
    branchPrefix,
    getGithubToken,
    getGithubUser,
    logDir,
    nixBuildOptions,
    nixCommonOptions,
    parseUpdates,
    prTitle,
    runLog,
    srcOrMain,
    titleVersion,
    whenBatch,
    regDirMode,
    outpathCacheDir,
    cacheDir,
    worktreeDir,
  )
where

import Data.Bits ((.|.))
import Data.Maybe (fromJust)
import qualified Data.Text as T
import qualified Data.Text.IO as T
import Database.SQLite.Simple (ResultError (..), SQLData (..))
import Database.SQLite.Simple.FromField
  ( FieldParser,
    FromField,
    fromField,
    returnError,
  )
import Database.SQLite.Simple.Internal (Field (..))
import Database.SQLite.Simple.Ok (Ok (..))
import Database.SQLite.Simple.ToField (ToField, toField)
import qualified GitHub as GH
import OurPrelude
import Polysemy.Output
import System.Directory (createDirectoryIfMissing, doesDirectoryExist)
import System.Environment (lookupEnv)
import System.Posix.Directory (createDirectory)
import System.Posix.Env (getEnv)
import System.Posix.Files
  ( directoryMode,
    fileExist,
    groupModes,
    otherExecuteMode,
    otherReadMode,
    ownerModes,
  )
import System.Posix.Temp (mkdtemp)
import System.Posix.Types (FileMode)
import Text.Read (readEither)
import Type.Reflection (Typeable)

default (T.Text)

type ProductID = Text

type Version = Text

type URL = Text

-- | The Ord instance is used to sort lists of matchers in order to compare them
-- as a set; it is not useful for comparing bounds, since the ordering of bounds
-- depends on whether it is a start or end bound.
data Boundary a
  = Unbounded
  | Including a
  | Excluding a
  deriving (Eq, Ord, Show, Read)

-- | The Ord instance is used to sort lists of matchers in order to compare them
-- as a set; it is not useful for comparing versions.
data VersionMatcher
  = SingleMatcher Version
  | RangeMatcher (Boundary Version) (Boundary Version)
  deriving (Eq, Ord, Show, Read)

readField :: (Read a, Typeable a) => FieldParser a
readField f@(Field (SQLText t) _) =
  case readEither (T.unpack t) of
    Right x -> Ok x
    Left e -> returnError ConversionFailed f $ "read error: " <> e
readField f = returnError ConversionFailed f "expecting SQLText column type"

showField :: Show a => a -> SQLData
showField = toField . show

instance FromField VersionMatcher where
  fromField = readField

instance ToField VersionMatcher where
  toField = showField
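The two instances above persist a `VersionMatcher` to SQLite as its `Show` text and recover it with `read`. A standalone round-trip sketch of that encoding (types restated locally for illustration, without the SQLite wrappers):

```haskell
-- Local restatement of the types above, just for this example.
data Boundary a = Unbounded | Including a | Excluding a
  deriving (Eq, Ord, Show, Read)

data VersionMatcher
  = SingleMatcher String
  | RangeMatcher (Boundary String) (Boundary String)
  deriving (Eq, Ord, Show, Read)

-- showField/readField reduce to show and read for this type, so the
-- derived instances must round-trip.
roundTrip :: VersionMatcher -> Bool
roundTrip m = read (show m) == m

main :: IO ()
main =
  print
    ( all
        roundTrip
        [ SingleMatcher "2.1",
          RangeMatcher (Including "1.0") Unbounded
        ]
    )
```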

data Options = Options
  { doPR :: Bool,
    batchUpdate :: Bool,
    githubUser :: GH.Name GH.Owner,
    githubToken :: Text,
    makeCVEReport :: Bool,
    runNixpkgsReview :: Bool,
    calculateOutpaths :: Bool,
    attrpath :: Bool
  }
  deriving (Show)

data UpdateEnv = UpdateEnv
  { packageName :: Text,
    oldVersion :: Version,
    newVersion :: Version,
    sourceURL :: Maybe URL,
    options :: Options
  }

whenBatch :: Applicative f => UpdateEnv -> f () -> f ()
whenBatch updateEnv = when (batchUpdate . options $ updateEnv)

prTitle :: UpdateEnv -> Text -> Text
prTitle updateEnv attrPath =
  let oV = oldVersion updateEnv
      nV = newVersion updateEnv
   in T.strip [interpolate| $attrPath: $oV -> $nV |]

titleVersion :: Text -> Maybe Version
titleVersion title = if T.null prefix then Nothing else Just suffix
  where
    (prefix, suffix) = T.breakOnEnd " -> " title
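`titleVersion` recovers the new version from a PR title of the `prTitle` form `attrPath: oldV -> newV`: everything after the last `" -> "` is the new version, and a title without the arrow yields `Nothing`. A standalone sketch (function restated locally):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.Text as T

-- Restatement of titleVersion: T.breakOnEnd splits after the last
-- occurrence of " -> ", so suffix is the new version.
titleVersionDemo :: T.Text -> Maybe T.Text
titleVersionDemo title = if T.null prefix then Nothing else Just suffix
  where
    (prefix, suffix) = T.breakOnEnd " -> " title

main :: IO ()
main = do
  print (titleVersionDemo "hello: 2.10 -> 2.12") -- Just "2.12"
  print (titleVersionDemo "no version arrow")    -- Nothing
```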

regDirMode :: FileMode
regDirMode =
  directoryMode
    .|. ownerModes
    .|. groupModes
    .|. otherReadMode
    .|. otherExecuteMode

logsDirectory :: MonadIO m => ExceptT Text m FilePath
logsDirectory = do
  dir <-
    noteT "Could not get environment variable LOGS_DIRECTORY" $
      MaybeT $
        liftIO $
          getEnv "LOGS_DIRECTORY"
  dirExists <- liftIO $ doesDirectoryExist dir
  tryAssert ("LOGS_DIRECTORY " <> T.pack dir <> " does not exist.") dirExists
  return dir

cacheDir :: MonadIO m => m FilePath
cacheDir = do
  cacheDirectory <- liftIO $ lookupEnv "CACHE_DIRECTORY"
  xdgCacheHome <- liftIO $ fmap (fmap (\dir -> dir </> "nixpkgs-update")) $ lookupEnv "XDG_CACHE_HOME"
  cacheHome <- liftIO $ fmap (fmap (\dir -> dir </> ".cache/nixpkgs-update")) $ lookupEnv "HOME"
  let dir = fromJust (cacheDirectory <|> xdgCacheHome <|> cacheHome)
  liftIO $ createDirectoryIfMissing True dir
  return dir
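`cacheDir` relies on the `Alternative` instance for `Maybe` to pick the first defined location: `CACHE_DIRECTORY`, then `$XDG_CACHE_HOME/nixpkgs-update`, then `$HOME/.cache/nixpkgs-update`. A quick sketch of that fallback (the example paths are made up):

```haskell
import Control.Applicative ((<|>))

-- (<|>) on Maybe keeps the first Just, so candidates are tried in order.
firstDefined :: Maybe FilePath
firstDefined =
  Nothing -- CACHE_DIRECTORY unset
    <|> Just "/home/u/.xdg-cache/nixpkgs-update" -- XDG_CACHE_HOME present
    <|> Just "/home/u/.cache/nixpkgs-update" -- HOME fallback, not reached

main :: IO ()
main = print firstDefined -- Just "/home/u/.xdg-cache/nixpkgs-update"
```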

outpathCacheDir :: MonadIO m => m FilePath
outpathCacheDir = do
  cache <- cacheDir
  let dir = cache </> "outpath"
  liftIO $ createDirectoryIfMissing False dir
  return dir

worktreeDir :: IO FilePath
worktreeDir = do
  cache <- cacheDir
  let dir = cache </> "worktree"
  createDirectoryIfMissing False dir
  return dir

xdgRuntimeDir :: MonadIO m => ExceptT Text m FilePath
xdgRuntimeDir = do
  xDir <-
    noteT "Could not get environment variable XDG_RUNTIME_DIR" $
      MaybeT $
        liftIO $
          getEnv "XDG_RUNTIME_DIR"
  xDirExists <- liftIO $ doesDirectoryExist xDir
  tryAssert ("XDG_RUNTIME_DIR " <> T.pack xDir <> " does not exist.") xDirExists
  let dir = xDir </> "nixpkgs-update"
  dirExists <- liftIO $ fileExist dir
  unless
    dirExists
    ( liftIO $
        putStrLn "creating xdgRuntimeDir" >> createDirectory dir regDirMode
    )
  return dir

tmpRuntimeDir :: MonadIO m => ExceptT Text m FilePath
tmpRuntimeDir = do
  dir <- liftIO $ mkdtemp "nixpkgs-update"
  dirExists <- liftIO $ doesDirectoryExist dir
  tryAssert
    ("Temporary directory " <> T.pack dir <> " does not exist.")
    dirExists
  return dir

logDir :: IO FilePath
logDir = do
  r <-
    runExceptT
      ( logsDirectory
          <|> xdgRuntimeDir
          <|> tmpRuntimeDir
          <|> throwE
            "Failed to create log directory."
      )
  case r of
    Right dir -> return dir
    Left e -> error $ T.unpack e

branchPrefix :: Text
branchPrefix = "auto-update/"

branchName :: UpdateEnv -> Text
branchName ue = branchPrefix <> packageName ue

parseUpdates :: Text -> [Either Text (Text, Version, Version, Maybe URL)]
parseUpdates = map (toTriple . T.words) . T.lines
  where
    toTriple :: [Text] -> Either Text (Text, Version, Version, Maybe URL)
    toTriple [package, oldVer, newVer] = Right (package, oldVer, newVer, Nothing)
    toTriple [package, oldVer, newVer, url] = Right (package, oldVer, newVer, Just url)
    toTriple line = Left $ "Unable to parse update: " <> T.unwords line
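`parseUpdates` expects one update per line in the form `package oldVersion newVersion`, with an optional trailing URL. A standalone sketch of the per-line parse (simplified restatement of the logic above):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.Text as T

-- Simplified restatement of the per-line logic in parseUpdates:
-- three fields, or four with a source URL; anything else is an error.
parseLine :: T.Text -> Either T.Text (T.Text, T.Text, T.Text, Maybe T.Text)
parseLine line = case T.words line of
  [p, o, n] -> Right (p, o, n, Nothing)
  [p, o, n, u] -> Right (p, o, n, Just u)
  ws -> Left ("Unable to parse update: " <> T.unwords ws)

main :: IO ()
main = do
  print (parseLine "hello 2.10 2.12")
  print (parseLine "hello 2.10 2.12 https://example.com/src.tar.gz")
  print (parseLine "not-enough-fields")
```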

srcOrMain :: MonadIO m => (Text -> ExceptT Text m a) -> Text -> ExceptT Text m a
srcOrMain et attrPath = et (attrPath <> ".src") <|> et (attrPath <> ".originalSrc") <|> et attrPath

nixCommonOptions :: [String]
nixCommonOptions =
  [ "--arg",
    "config",
    "{ allowUnfree = true; allowAliases = false; }",
    "--arg",
    "overlays",
    "[ ]"
  ]

nixBuildOptions :: [String]
nixBuildOptions =
  [ "--option",
    "sandbox",
    "true"
  ]
    <> nixCommonOptions

runLog ::
  Member (Embed IO) r =>
  (Text -> IO ()) ->
  Sem ((Output Text) ': r) a ->
  Sem r a
runLog logger =
  interpret \case
    Output o -> embed $ logger o

envToken :: IO (Maybe Text)
envToken = fmap T.pack <$> getEnv "GITHUB_TOKEN"

localToken :: IO (Maybe Text)
localToken = do
  exists <- fileExist "github_token.txt"
  if exists
    then (Just . T.strip <$> T.readFile "github_token.txt")
    else (return Nothing)

hubFileLocation :: IO (Maybe FilePath)
hubFileLocation = do
  xloc <- fmap (</> "hub") <$> getEnv "XDG_CONFIG_HOME"
  hloc <- fmap (</> ".config/hub") <$> getEnv "HOME"
  return (xloc <|> hloc)

hubConfigField :: Text -> IO (Maybe Text)
hubConfigField field = do
  hubFile <- hubFileLocation
  case hubFile of
    Nothing -> return Nothing
    Just file -> do
      exists <- fileExist file
      if not exists
        then return Nothing
        else d
SYMBOL INDEX (14 symbols across 7 files)

FILE: rust/migrations/2023-08-12-152848_create_packages/up.sql
  type packages (line 1) | CREATE TABLE packages (

FILE: rust/src/github.rs
  function token (line 3) | fn token() -> Option<String> {
  function latest_release (line 13) | pub fn latest_release(github: &Github) -> Result<json::JsonValue, &'stat...
  type Github (line 32) | pub struct Github {
  function from (line 37) | pub fn from(attr_path: &String) -> Option<Github> {

FILE: rust/src/lib.rs
  function establish_connection (line 7) | pub fn establish_connection() -> SqliteConnection {

FILE: rust/src/main.rs
  function version_in_nixpkgs_branch (line 10) | fn version_in_nixpkgs_branch(branch: &str, attr_path: &String) -> Option...
  function version_in_nixpkgs_master (line 14) | fn version_in_nixpkgs_master(attr_path: &String) -> Option<String> {
  function version_in_nixpkgs_staging (line 18) | fn version_in_nixpkgs_staging(attr_path: &String) -> Option<String> {
  function version_in_nixpkgs_staging_next (line 22) | fn version_in_nixpkgs_staging_next(attr_path: &String) -> Option<String> {
  function main (line 26) | fn main() {

FILE: rust/src/models.rs
  type Package (line 7) | pub struct Package {

FILE: rust/src/nix.rs
  function eval (line 3) | pub fn eval(branch: &str, attr_path: &String, apply: &str) -> Option<Str...

FILE: rust/src/repology.rs
  function latest_version (line 1) | pub fn latest_version(project_name: &String) -> Result<json::JsonValue, ...

About this extraction

This page contains the full source code of the ryantm/nixpkgs-update GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 71 files (207.5 KB), approximately 58.4k tokens, and a symbol index with 14 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.
