Full Code of r-lib/gh for AI

Repository: r-lib/gh
Branch: main
Commit: e9194ae1bb29
Files: 75
Total size: 133.1 KB

Directory structure:
gh/

├── .Rbuildignore
├── .github/
│   ├── .gitignore
│   ├── CODEOWNERS
│   ├── CODE_OF_CONDUCT.md
│   └── workflows/
│       ├── R-CMD-check.yaml
│       ├── pkgdown.yaml
│       ├── pr-commands.yaml
│       ├── rhub.yaml
│       └── test-coverage.yaml
├── .gitignore
├── .vscode/
│   ├── extensions.json
│   └── settings.json
├── DESCRIPTION
├── LICENSE
├── LICENSE.md
├── Makefile
├── NAMESPACE
├── NEWS.md
├── R/
│   ├── gh-package.R
│   ├── gh.R
│   ├── gh_gql.R
│   ├── gh_rate_limit.R
│   ├── gh_request.R
│   ├── gh_response.R
│   ├── gh_token.R
│   ├── gh_whoami.R
│   ├── git.R
│   ├── import-standalone-purrr.R
│   ├── pagination.R
│   ├── print.R
│   └── utils.R
├── README.Rmd
├── README.md
├── _pkgdown.yml
├── air.toml
├── codecov.yml
├── gh.Rproj
├── inst/
│   └── WORDLIST
├── man/
│   ├── gh-package.Rd
│   ├── gh.Rd
│   ├── gh_gql.Rd
│   ├── gh_next.Rd
│   ├── gh_rate_limit.Rd
│   ├── gh_token.Rd
│   ├── gh_tree_remote.Rd
│   ├── gh_whoami.Rd
│   └── print.gh_response.Rd
├── tests/
│   ├── testthat/
│   │   ├── _snaps/
│   │   │   ├── gh.md
│   │   │   ├── gh_rate_limit.md
│   │   │   ├── gh_request.md
│   │   │   ├── gh_response.md
│   │   │   ├── gh_token.md
│   │   │   ├── gh_whoami.md
│   │   │   ├── pagination.md
│   │   │   ├── print.md
│   │   │   └── utils.md
│   │   ├── helper-offline.R
│   │   ├── helper.R
│   │   ├── setup.R
│   │   ├── test-gh.R
│   │   ├── test-gh_rate_limit.R
│   │   ├── test-gh_request.R
│   │   ├── test-gh_response.R
│   │   ├── test-gh_token.R
│   │   ├── test-gh_whoami.R
│   │   ├── test-git.R
│   │   ├── test-mock-repos.R
│   │   ├── test-old-templates.R
│   │   ├── test-pagination.R
│   │   ├── test-print.R
│   │   ├── test-spelling.R
│   │   └── test-utils.R
│   └── testthat.R
└── vignettes/
    ├── .gitignore
    └── managing-personal-access-tokens.Rmd

================================================
FILE CONTENTS
================================================

================================================
FILE: .Rbuildignore
================================================
^.*\.Rproj$
^\.Rproj\.user$
^Makefile$
^README.Rmd$
^.travis.yml$
^appveyor.yml$
^tags$
^tests/testthat/github-token\.txt$
^CONTRIBUTING\.md$
^tests/testthat/httrmock$
^\.github$
^\.Rprofile$
^r-packages$
^revdep$
^_pkgdown\.yml$
^docs$
^pkgdown$
^codecov\.yml$
^LICENSE\.md$
^[\.]?air\.toml$
^\.vscode$


================================================
FILE: .github/.gitignore
================================================
*.html


================================================
FILE: .github/CODEOWNERS
================================================
# CODEOWNERS for gh
# https://www.tidyverse.org/development/understudies
.github/CODEOWNERS @gaborcsardi @jennybc


================================================
FILE: .github/CODE_OF_CONDUCT.md
================================================
# Contributor Covenant Code of Conduct

## Our Pledge

We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, caste, color, religion, or sexual
identity and orientation.

We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.

## Our Standards

Examples of behavior that contributes to a positive environment for our
community include:

* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
  and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall
  community

Examples of unacceptable behavior include:

* The use of sexualized language or imagery, and sexual attention or advances of
  any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email address,
  without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Enforcement Responsibilities

Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.

Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.

## Scope

This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at codeofconduct@posit.co. 
All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the
reporter of any incident.

## Enforcement Guidelines

Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:

### 1. Correction

**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.

**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.

### 2. Warning

**Community Impact**: A violation through a single incident or series of
actions.

**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or permanent
ban.

### 3. Temporary Ban

**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.

**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.

### 4. Permanent Ban

**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.

**Consequence**: A permanent ban from any sort of public interaction within the
community.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.1, available at
<https://www.contributor-covenant.org/version/2/1/code_of_conduct.html>.

Community Impact Guidelines were inspired by
[Mozilla's code of conduct enforcement ladder](https://github.com/mozilla/inclusion).

For answers to common questions about this code of conduct, see the FAQ at
<https://www.contributor-covenant.org/faq>. Translations are available at <https://www.contributor-covenant.org/translations>.

[homepage]: https://www.contributor-covenant.org


================================================
FILE: .github/workflows/R-CMD-check.yaml
================================================
# Workflow derived from https://github.com/r-lib/actions/tree/v2/examples
# Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help
#
# NOTE: This workflow is overkill for most R packages and
# check-standard.yaml is likely a better choice.
# usethis::use_github_action("check-standard") will install it.
on:
  push:
    branches: [main, master]
  pull_request:

name: R-CMD-check.yaml

permissions: read-all

jobs:
  R-CMD-check:
    runs-on: ${{ matrix.config.os }}

    name: ${{ matrix.config.os }} (${{ matrix.config.r }})

    strategy:
      fail-fast: false
      matrix:
        config:
          - {os: macos-latest,   r: 'release'}

          - {os: windows-latest, r: 'release'}
          # use 4.0 or 4.1 to check with rtools40's older compiler
          - {os: windows-latest, r: 'oldrel-4'}

          - {os: ubuntu-latest,  r: 'devel', http-user-agent: 'release'}
          - {os: ubuntu-latest,  r: 'release'}
          - {os: ubuntu-latest,  r: 'oldrel-1'}
          - {os: ubuntu-latest,  r: 'oldrel-2'}
          - {os: ubuntu-latest,  r: 'oldrel-3'}
          - {os: ubuntu-latest,  r: 'oldrel-4'}

    env:
      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}
      R_KEEP_PKG_SOURCE: yes

    steps:
      - uses: actions/checkout@v4

      - uses: r-lib/actions/setup-pandoc@v2

      - uses: r-lib/actions/setup-r@v2
        with:
          r-version: ${{ matrix.config.r }}
          http-user-agent: ${{ matrix.config.http-user-agent }}
          use-public-rspm: true

      - uses: r-lib/actions/setup-r-dependencies@v2
        with:
          extra-packages: any::rcmdcheck
          needs: check

      - uses: r-lib/actions/check-r-package@v2
        with:
          upload-snapshots: true
          build_args: 'c("--no-manual","--compact-vignettes=gs+qpdf")'


================================================
FILE: .github/workflows/pkgdown.yaml
================================================
# Workflow derived from https://github.com/r-lib/actions/tree/v2/examples
# Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help
on:
  push:
    branches: [main, master]
  pull_request:
  release:
    types: [published]
  workflow_dispatch:

name: pkgdown.yaml

permissions: read-all

jobs:
  pkgdown:
    runs-on: ubuntu-latest
    # Only restrict concurrency for non-PR jobs
    concurrency:
      group: pkgdown-${{ github.event_name != 'pull_request' || github.run_id }}
    env:
      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}
    permissions:
      contents: write
    steps:
      - uses: actions/checkout@v4

      - uses: r-lib/actions/setup-pandoc@v2

      - uses: r-lib/actions/setup-r@v2
        with:
          use-public-rspm: true

      - uses: r-lib/actions/setup-r-dependencies@v2
        with:
          extra-packages: any::pkgdown, local::.
          needs: website

      - name: Build site
        run: pkgdown::build_site_github_pages(new_process = FALSE, install = FALSE)
        shell: Rscript {0}

      - name: Deploy to GitHub pages 🚀
        if: github.event_name != 'pull_request'
        uses: JamesIves/github-pages-deploy-action@v4.5.0
        with:
          clean: false
          branch: gh-pages
          folder: docs


================================================
FILE: .github/workflows/pr-commands.yaml
================================================
# Workflow derived from https://github.com/r-lib/actions/tree/v2/examples
# Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help
on:
  issue_comment:
    types: [created]

name: pr-commands.yaml

permissions: read-all

jobs:
  document:
    if: ${{ github.event.issue.pull_request && (github.event.comment.author_association == 'MEMBER' || github.event.comment.author_association == 'OWNER') && startsWith(github.event.comment.body, '/document') }}
    name: document
    runs-on: ubuntu-latest
    env:
      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}
    permissions:
      contents: write
    steps:
      - uses: actions/checkout@v4

      - uses: r-lib/actions/pr-fetch@v2
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}

      - uses: r-lib/actions/setup-r@v2
        with:
          use-public-rspm: true

      - uses: r-lib/actions/setup-r-dependencies@v2
        with:
          extra-packages: any::roxygen2
          needs: pr-document

      - name: Document
        run: roxygen2::roxygenise()
        shell: Rscript {0}

      - name: commit
        run: |
          git config --local user.name "$GITHUB_ACTOR"
          git config --local user.email "$GITHUB_ACTOR@users.noreply.github.com"
          git add man/\* NAMESPACE
          git commit -m 'Document'

      - uses: r-lib/actions/pr-push@v2
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}

  style:
    if: ${{ github.event.issue.pull_request && (github.event.comment.author_association == 'MEMBER' || github.event.comment.author_association == 'OWNER') && startsWith(github.event.comment.body, '/style') }}
    name: style
    runs-on: ubuntu-latest
    env:
      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}
    permissions:
      contents: write
    steps:
      - uses: actions/checkout@v4

      - uses: r-lib/actions/pr-fetch@v2
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}

      - uses: r-lib/actions/setup-r@v2

      - name: Install dependencies
        run: install.packages("styler")
        shell: Rscript {0}

      - name: Style
        run: styler::style_pkg()
        shell: Rscript {0}

      - name: commit
        run: |
          git config --local user.name "$GITHUB_ACTOR"
          git config --local user.email "$GITHUB_ACTOR@users.noreply.github.com"
          git add \*.R
          git commit -m 'Style'

      - uses: r-lib/actions/pr-push@v2
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}


================================================
FILE: .github/workflows/rhub.yaml
================================================
# R-hub's generic GitHub Actions workflow file. Its canonical location is at
# https://github.com/r-hub/rhub2/blob/v1/inst/workflow/rhub.yaml
# You can update this file to a newer version using the rhub2 package:
#
# rhub2::rhub_setup()
#
# It is unlikely that you need to modify this file manually.

name: R-hub
run-name: "${{ github.event.inputs.id }}: ${{ github.event.inputs.name || format('Manually run by {0}', github.triggering_actor) }}"

on:
  workflow_dispatch:
    inputs:
      config:
        description: 'A comma separated list of R-hub platforms to use.'
        type: string
        default: 'linux,windows,macos'
      name:
        description: 'Run name. You can leave this empty now.'
        type: string
      id:
        description: 'Unique ID. You can leave this empty now.'
        type: string

jobs:

  setup:
    runs-on: ubuntu-latest
    outputs:
      containers: ${{ steps.rhub-setup.outputs.containers }}
      platforms: ${{ steps.rhub-setup.outputs.platforms }}

    steps:
    # NO NEED TO CHECKOUT HERE
    - uses: r-hub/rhub2/actions/rhub-setup@v1
      with:
        config: ${{ github.event.inputs.config }}
      id: rhub-setup

  linux-containers:
    needs: setup
    if: ${{ needs.setup.outputs.containers != '[]' }}
    runs-on: ubuntu-latest
    name: ${{ matrix.config.label }}
    strategy:
      fail-fast: false
      matrix:
        config: ${{ fromJson(needs.setup.outputs.containers) }}
    container:
      image: ${{ matrix.config.container }}

    steps:
      - uses: actions/checkout@v4
      - uses: r-hub/rhub2/actions/rhub-check@v1
        with:
          token: ${{ secrets.RHUB_TOKEN }}
          job-config: ${{ matrix.config.job-config }}

  other-platforms:
    needs: setup
    if: ${{ needs.setup.outputs.platforms != '[]' }}
    runs-on: ${{ matrix.config.os }}
    name: ${{ matrix.config.label }}
    strategy:
      fail-fast: false
      matrix:
        config: ${{ fromJson(needs.setup.outputs.platforms) }}

    steps:
      - uses: actions/checkout@v4
      - uses: r-hub/rhub2/actions/rhub-setup-r@v1
        with:
          job-config: ${{ matrix.config.job-config }}
          token: ${{ secrets.RHUB_TOKEN }}
      - uses: r-hub/rhub2/actions/rhub-check@v1
        with:
          job-config: ${{ matrix.config.job-config }}
          token: ${{ secrets.RHUB_TOKEN }}


================================================
FILE: .github/workflows/test-coverage.yaml
================================================
# Workflow derived from https://github.com/r-lib/actions/tree/v2/examples
# Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help
on:
  push:
    branches: [main, master]
  pull_request:

name: test-coverage.yaml

permissions: read-all

jobs:
  test-coverage:
    runs-on: ubuntu-latest
    env:
      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}

    steps:
      - uses: actions/checkout@v4

      - uses: r-lib/actions/setup-r@v2
        with:
          use-public-rspm: true

      - uses: r-lib/actions/setup-r-dependencies@v2
        with:
          extra-packages: any::covr, any::xml2
          needs: coverage

      - name: Test coverage
        run: |
          cov <- covr::package_coverage(
            quiet = FALSE,
            clean = FALSE,
            install_path = file.path(normalizePath(Sys.getenv("RUNNER_TEMP"), winslash = "/"), "package")
          )
          print(cov)
          covr::to_cobertura(cov)
        shell: Rscript {0}

      - uses: codecov/codecov-action@v5
        with:
          # Fail on error if not on a PR, or if on a PR and a token is given
          fail_ci_if_error: ${{ github.event_name != 'pull_request' || secrets.CODECOV_TOKEN }}
          files: ./cobertura.xml
          plugins: noop
          disable_search: true
          token: ${{ secrets.CODECOV_TOKEN }}

      - name: Show testthat output
        if: always()
        run: |
          ## --------------------------------------------------------------------
          find '${{ runner.temp }}/package' -name 'testthat.Rout*' -exec cat '{}' \; || true
        shell: bash

      - name: Upload test results
        if: failure()
        uses: actions/upload-artifact@v4
        with:
          name: coverage-test-failures
          path: ${{ runner.temp }}/package


================================================
FILE: .gitignore
================================================
.Rproj.user
.Rhistory
.RData
/tags
/tests/testthat/github-token.txt
/r-packages
/revdep
docs
inst/doc


================================================
FILE: .vscode/extensions.json
================================================
{
    "recommendations": [
        "Posit.air-vscode"
    ]
}


================================================
FILE: .vscode/settings.json
================================================
{
    "[r]": {
        "editor.formatOnSave": true,
        "editor.defaultFormatter": "Posit.air-vscode"
    },
    "makefile.configureOnOpen": false
}


================================================
FILE: DESCRIPTION
================================================
Package: gh
Title: 'GitHub' 'API'
Version: 1.5.0.9000
Authors@R: c(
    person("Gábor", "Csárdi", , "csardi.gabor@gmail.com", role = c("cre", "ctb")),
    person("Jennifer", "Bryan", role = "aut"),
    person("Hadley", "Wickham", role = "aut"),
    person("Posit Software, PBC", role = c("cph", "fnd"),
           comment = c(ROR = "03wc8by49"))
  )
Description: Minimal client to access the 'GitHub' 'API'.
License: MIT + file LICENSE
URL: https://gh.r-lib.org/, https://github.com/r-lib/gh#readme
BugReports: https://github.com/r-lib/gh/issues
Depends:
    R (>= 4.1)
Imports:
    cli (>= 3.0.1),
    gitcreds,
    glue,
    httr2 (>= 1.0.6),
    ini,
    jsonlite,
    lifecycle,
    rlang (>= 1.0.0)
Suggests:
    connectcreds,
    covr,
    knitr,
    rmarkdown,
    rprojroot,
    spelling,
    testthat (>= 3.0.0),
    vctrs,
    withr
VignetteBuilder:
    knitr
Config/Needs/website: tidyverse/tidytemplate
Config/testthat/edition: 3
Config/usethis/last-upkeep: 2025-04-29
Encoding: UTF-8
Language: en-US
Roxygen: list(markdown = TRUE)
RoxygenNote: 7.3.2.9000


================================================
FILE: LICENSE
================================================
YEAR: 2025
COPYRIGHT HOLDER: gh authors


================================================
FILE: LICENSE.md
================================================
# MIT License

Copyright (c) 2025 gh authors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


================================================
FILE: Makefile
================================================

all: README.md

README.md: README.Rmd
	Rscript -e "library(knitr); knit('$<', output = '$@', quiet = TRUE)"


================================================
FILE: NAMESPACE
================================================
# Generated by roxygen2: do not edit by hand

S3method(format,gh_pat)
S3method(print,gh_pat)
S3method(print,gh_response)
S3method(str,gh_pat)
S3method(vctrs::vec_cast,list.gh_response)
S3method(vctrs::vec_ptype2,gh_response.gh_response)
export(gh)
export(gh_first)
export(gh_gql)
export(gh_last)
export(gh_next)
export(gh_prev)
export(gh_rate_limit)
export(gh_rate_limits)
export(gh_token)
export(gh_token_exists)
export(gh_tree_remote)
export(gh_whoami)
import(rlang)
importFrom(cli,cli_status)
importFrom(cli,cli_status_update)
importFrom(glue,glue)
importFrom(jsonlite,fromJSON)
importFrom(jsonlite,prettify)
importFrom(jsonlite,toJSON)
importFrom(lifecycle,deprecated)
importFrom(utils,URLencode)
importFrom(utils,capture.output)


================================================
FILE: NEWS.md
================================================
# gh (development version)

# gh 1.5.0

## BREAKING CHANGES

### Posit Security Advisory (PSA) - PSA-1649

* Posit acknowledges that the response header may contain sensitive
  information (#222). Thank you to @foysal1197 for your thorough research
  and responsible disclosure.

  `gh()`, and other functions that use it, no longer save the request
  headers in the returned object. Consequently, if you use the `gh_next()`,
  `gh_prev()`, `gh_first()` or `gh_last()` functions and passed `.token`
  and/or `.send_headers` explicitly to the original `gh()` (or similar)
  call, then you'll also need to pass the same `.token` and/or
  `.send_headers` to `gh_next()`, `gh_prev()`, `gh_first()` or `gh_last()`.

## OTHER CHANGES

* New `gh_token_exists()` tells you if a valid GH token has been set.

* `gh()` now uses a cache provided by httr2. This cache lives in
  `tools::R_user_dir("gh", "cache")`, maxes out at 100 MB, and can be
  disabled by setting `options(gh_cache = FALSE)` (#203).

* `gh_token()` can now pick up on the viewer's GitHub credentials (if any)
  when running on Posit Connect (@atheriel, #217).

# gh 1.4.1

* `gh_next()`, `gh_prev()`, `gh_first()` and `gh_last()`
  now work correctly again (#181).

* When the user sets `.destfile` to write the response to disk, gh now
  writes the output to a temporary file, which is then renamed to
  `.destfile` after performing the request, or deleted on error (#178).

# gh 1.4.0

* `gh()` gains a new `.max_rate` parameter that sets the maximum number of
  requests per second.

* gh is now powered by httr2. This should generally have little impact on normal
  operation but if a request fails, you can use `httr2::last_response()` and
  `httr2::last_request()` to debug.

* `gh()` gains a new `.max_wait` argument which gives the maximum number of
  minutes to wait if you are rate limited (#67).

* New `gh_rate_limits()` function reports on all rate limits for the active
  user.

* gh can now validate GitHub
  [fine-grained](https://github.blog/security/application-security/introducing-fine-grained-personal-access-tokens-for-github/)
  personal access tokens (@jvstein, #171).

# gh 1.3.1

* gh now accepts lower-case methods, i.e. both `gh::gh("get /users/hadley/repos")` and `gh::gh("GET /users/hadley/repos")` work (@maelle, #167).

* Response headers (`"response_headers"`) and response content
  (`"response_content"`) are now returned in error conditions so that error
  handlers can use information, such as the rate limit reset header, when
  handling `github_error`s (@gadenbuie, #117).

# gh 1.3.0

* gh now shows the correct number of records in its progress bar when
  paginating (#147).

* New `.params` argument in `gh()` to make it easier to pass parameters to
  it programmatically (#140).

# gh 1.2.1

* Token validation accounts for the new format
  [announced 2021-03-04](https://github.blog/changelog/2021-03-04-authentication-token-format-updates/)
  and implemented on 2021-04-01 (#148, @fmichonneau).

# gh 1.2.0

* `gh_gql()` now passes all arguments to `gh()` (#124).

* gh now handles responses from pagination better, and tries to properly
  merge them (#136, @rundel).

* gh can retrieve a PAT from the Git credential store, where the lookup is
  based on the targeted API URL. This now uses the gitcreds package. The
  environment variables consulted for URL-specific GitHub PATs have changed.
  - For "https://api.github.com": `GITHUB_PAT_GITHUB_COM` now, instead of
    `GITHUB_PAT_API_GITHUB_COM`
  - For "https://github.acme.com/api/v3": `GITHUB_PAT_GITHUB_ACME_COM` now,
    instead of `GITHUB_PAT_GITHUB_ACME_COM_API_V3`

  See the documentation of the gitcreds package for details.

* The keyring package is no longer used, in favor of the Git credential
  store.

* The documentation for the GitHub REST API has moved to
  <https://docs.github.com/rest> and endpoints are now documented using
  the URI template style of [RFC 6570](https://www.rfc-editor.org/rfc/rfc6570):
  - Old: `GET /repos/:owner/:repo/issues`
  - New: `GET /repos/{owner}/{repo}/issues`

  gh accepts and prioritizes the new style. However, it still does parameter
  substitution for the old style.

* Fixed an error that occurred when calling `gh()` with `.progress = FALSE`
  (@gadenbuie, #115).

* `gh()` accepts named `NA` parameters that are destined for the request
  body (#139).

# gh 1.1.0

* Raw responses from GitHub are now returned as raw vector.

* Responses may be written to disk by providing a path in the `.destfile`
  argument.

* gh now sets `.Last.error` to the error object after an uncaught error,
  and `.Last.error.trace` to the stack trace of the error.

* `gh()` now silently drops named `NULL` parameters, and throws an
  error for named `NA` parameters (#21, #84).

* `gh()` now returns better values for empty responses, typically empty
  lists or dictionaries (#66).

* `gh()` now has an `.accept` argument to make it easier to set the
  `Accept` HTTP header (#91).

* New `gh_gql()` function to make it easier to work with the GitHub
  GraphQL API.

* gh now supports separate personal access tokens for GitHub Enterprise
  sites. See `?gh_token` for details.

* gh now supports storing your GitHub personal access tokens (PAT) in the
  system keyring, via the keyring package. See `?gh_token` for details.

* `gh()` can now POST raw data, which allows adding assets to releases (#56).

# gh 1.0.1

First public release.


================================================
FILE: R/gh-package.R
================================================
#' @keywords internal
#' @aliases gh-package
"_PACKAGE"

# The following block is used by usethis to automatically manage
# roxygen namespace tags. Modify with care!
## usethis namespace: start
#' @import rlang
#' @importFrom cli cli_status cli_status_update
#' @importFrom glue glue
#' @importFrom jsonlite fromJSON toJSON
#' @importFrom lifecycle deprecated
#' @importFrom utils URLencode capture.output
## usethis namespace: end
NULL


================================================
FILE: R/gh.R
================================================
#' Query the GitHub API
#'
#' This is an extremely minimal client. You need to know the API
#' to be able to use this client. All this function does is:
#' * Try to substitute each listed parameter into `endpoint`, using the
#'   `{parameter}` notation.
#' * If a GET request (the default), then add all other listed parameters
#'   as query parameters.
#' * If not a GET request, then send the other parameters in the request
#'   body, as JSON.
#' * Convert the response to an R list using [jsonlite::fromJSON()].
#'
#' @param endpoint GitHub API endpoint. Must be one of the following forms:
#'    * `METHOD path`, e.g. `GET /rate_limit`,
#'    * `path`, e.g. `/rate_limit`,
#'    * `METHOD url`, e.g. `GET https://api.github.com/rate_limit`,
#'    * `url`, e.g. `https://api.github.com/rate_limit`.
#'
#'    If the method is not supplied, will use `.method`, which defaults
#'    to `"GET"`.
#' @param ... Name-value pairs giving API parameters. Will be matched into
#'   `endpoint` placeholders, sent as query parameters in GET requests, and as a
#'   JSON body of POST requests. If there is only one unnamed parameter, and it
#'   is a raw vector, then it will not be JSON encoded, but sent as raw data, as
#'   is. This can be used for example to add assets to releases. Named `NULL`
#'   values are silently dropped. For GET requests, named `NA` values trigger an
#'   error. For other methods, named `NA` values are included in the body of the
#'   request, as JSON `null`.
#' @param per_page,.per_page Number of items to return per page. If omitted,
#'   will be substituted by `max(.limit, 100)` if `.limit` is set,
#'   otherwise determined by the API (never greater than 100).
#' @param .destfile Path to write response to disk. If `NULL` (default),
#'   response will be processed and returned as an object. If path is given,
#'   response will be written to disk in the form sent. gh writes the
#'   response to a temporary file, and renames that file to `.destfile`
#'   after the request was successful. The name of the temporary file is
#'   created by adding a `-<random>.gh-tmp` suffix to it, where `<random>`
#'   is an ASCII string with random characters. gh removes the temporary
#'   file on error.
#' @param .overwrite If `.destfile` is provided, whether to overwrite an
#'   existing file.  Defaults to `FALSE`. If an error happens the original
#'   file is kept.
#' @param .token Authentication token. Defaults to [gh_token()].
#' @param .api_url GitHub API URL (default: <https://api.github.com>). Used
#'   if `endpoint` just contains a path. Defaults to `GITHUB_API_URL`
#'   environment variable if set.
#' @param .method HTTP method to use if not explicitly supplied in the
#'    `endpoint`.
#' @param .limit Number of records to return. This can be used
#'   instead of manual pagination. By default it is `NULL`,
#'   which means that the defaults of the GitHub API are used.
#'   You can set it to a number to request more (or fewer)
#'   records, or to `Inf` to request all records.
#'   Note that if you request many records, multiple GitHub
#'   API calls are made to fetch them, which can take a long time.
#' @param .accept The value of the `Accept` HTTP header. Defaults to
#'   `"application/vnd.github.v3+json"`. If `Accept` is given in
#'   `.send_headers`, then that will be used. This parameter can be used to
#'   provide a custom media type, in order to access a preview feature of
#'   the API.
#' @param .send_headers Named character vector of header field values
#'   (except `Authorization`, which is handled via `.token`). This can be
#'   used to override or augment the default headers, e.g. the default
#'   `User-Agent`, `"https://github.com/r-lib/gh"`.
#' @param .progress Whether to show a progress indicator for calls that
#'   need more than one HTTP request.
#' @param .params Additional list of parameters to append to `...`.
#'   It is easier to use this than `...` if you have your parameters in
#'   a list already.
#' @param .max_wait Maximum number of seconds to wait if rate limited.
#'   Defaults to 10 minutes.
#' @param .max_rate Maximum request rate in requests per second. Set
#'   this to automatically throttle requests.
#' @return Answer from the API as a `gh_response` object, which is also a
#'   `list`. Failed requests will generate an R error. Requests that
#'   generate a raw response will return a raw vector.
#'
#' @export
#' @seealso [gh_gql()] if you want to use the GitHub GraphQL API,
#' [gh_whoami()] for details on GitHub API token management.
#' @examplesIf identical(Sys.getenv("IN_PKGDOWN"), "true")
#' ## Repositories of a user; these are equivalent
#' gh("/users/hadley/repos", .limit = 2)
#' gh("/users/{username}/repos", username = "hadley", .limit = 2)
#'
#' ## Starred repositories of a user
#' gh("/users/hadley/starred", .limit = 2)
#' gh("/users/{username}/starred", username = "hadley", .limit = 2)
#' @examplesIf FALSE
#' ## Create a repository, needs a token (see gh_token())
#' gh("POST /user/repos", name = "foobar")
#' @examplesIf identical(Sys.getenv("IN_PKGDOWN"), "true")
#' ## Issues of a repository
#' gh("/repos/hadley/dplyr/issues")
#' gh("/repos/{owner}/{repo}/issues", owner = "hadley", repo = "dplyr")
#'
#' ## Automatic pagination
#' users <- gh("/users", .limit = 50)
#' length(users)
#' @examplesIf FALSE
#' ## Access developer preview of Licenses API (in preview as of 2015-09-24)
#' gh("/licenses") # used to fail with error code 415
#' gh("/licenses", .accept = "application/vnd.github.drax-preview+json")
#' @examplesIf FALSE
#' ## Access Github Enterprise API
#' ## Use GITHUB_API_URL environment variable to change the default.
#' gh("/user/repos", type = "public", .api_url = "https://github.foobar.edu/api/v3")
#' @examplesIf FALSE
#' ## Use I() to force body part to be sent as an array, even if length 1
#' ## This works whether assignees has length 1 or > 1
#' assignees <- "gh_user"
#' assignees <- c("gh_user1", "gh_user2")
#' gh("PATCH /repos/OWNER/REPO/issues/1", assignees = I(assignees))
#' @examplesIf FALSE
#' ## There are two ways to send JSON data. One is that you supply one or
#' ## more objects that will be converted to JSON automatically via
#' ## jsonlite::toJSON(). In this case sometimes you need to use
#' ## jsonlite::unbox(), because toJSON() encodes scalar (length-1) vectors
#' ## as JSON arrays by default. The Content-Type header is added
#' ## automatically in this case. For example this request turns on GitHub
#' ## Pages, using this API:
#' ## https://docs.github.com/v3/repos/pages/#enable-a-pages-site
#'
#' gh::gh(
#'   "POST /repos/{owner}/{repo}/pages",
#'   owner = "r-lib",
#'   repo = "gh",
#'   source = list(
#'     branch = jsonlite::unbox("gh-pages"),
#'     path = jsonlite::unbox("/")
#'   ),
#'   .send_headers = c(Accept = "application/vnd.github.switcheroo-preview+json")
#' )
#'
#' ## The second way is to handle the JSON encoding manually, and supply it
#' ## as a raw vector in an unnamed argument, and also a Content-Type header:
#'
#' body <- '{ "source": { "branch": "gh-pages", "path": "/" } }'
#' gh::gh(
#'   "POST /repos/{owner}/{repo}/pages",
#'   owner = "r-lib",
#'   repo = "gh",
#'   charToRaw(body),
#'   .send_headers = c(
#'     Accept = "application/vnd.github.switcheroo-preview+json",
#'     "Content-Type" = "application/json"
#'   )
#' )
#' @examplesIf FALSE
#' ## Pass along a query to the search/code endpoint via the ... argument
#' x <- gh::gh(
#'   "/search/code",
#'   q = "installation repo:r-lib/gh",
#'   .send_headers = c("X-GitHub-Api-Version" = "2022-11-28")
#' )
#' str(x, list.len = 3, give.attr = FALSE)
#'
#'
gh <- function(
  endpoint,
  ...,
  per_page = NULL,
  .per_page = NULL,
  .token = NULL,
  .destfile = NULL,
  .overwrite = FALSE,
  .api_url = NULL,
  .method = "GET",
  .limit = NULL,
  .accept = "application/vnd.github.v3+json",
  .send_headers = NULL,
  .progress = TRUE,
  .params = list(),
  .max_wait = 600,
  .max_rate = NULL
) {
  params <- .parse_params(..., .params = .params)

  check_exclusive(per_page, .per_page, .require = FALSE)
  per_page <- per_page %||% .per_page
  if (is.null(per_page) && !is.null(.limit)) {
    per_page <- max(min(.limit, 100), 1)
  }
  if (!is.null(per_page)) {
    params <- c(params, list(per_page = per_page))
  }

  req <- gh_build_request(
    endpoint = endpoint,
    params = params,
    token = .token,
    destfile = .destfile,
    overwrite = .overwrite,
    accept = .accept,
    send_headers = .send_headers,
    max_wait = .max_wait,
    max_rate = .max_rate,
    api_url = .api_url,
    method = .method
  )

  if (req$method == "GET") check_named_nas(params)

  raw <- gh_make_request(req)
  res <- gh_process_response(raw, req)
  len <- gh_response_length(res)

  if (.progress && !is.null(.limit)) {
    pages <- min(gh_extract_pages(res), ceiling(.limit / per_page))
    cli::cli_progress_bar("Running gh query", total = pages)
    cli::cli_progress_update() # already done one
  }

  while (!is.null(.limit) && len < .limit && gh_has_next(res)) {
    res2 <- gh_next(res, .token = .token, .send_headers = .send_headers)
    len <- len + gh_response_length(res2)
    if (.progress) cli::cli_progress_update()

    if (!is.null(names(res2)) && identical(names(res), names(res2))) {
      res3 <- mapply(
        # Handle named array case
        function(x, y, n) {
          # e.g. GET /search/repositories
          z <- c(x, y)
          atm <- is.atomic(z)
          if (atm && n %in% c("total_count", "incomplete_results")) {
            y
          } else if (atm) {
            unique(z)
          } else {
            z
          }
        },
        res,
        res2,
        names(res),
        SIMPLIFY = FALSE
      )
    } else {
      # Handle unnamed array case
      res3 <- c(res, res2) # e.g. GET /orgs/:org/invitations
    }

    attributes(res3) <- attributes(res2)
    res <- res3
  }

  if (.progress) cli::cli_progress_done()

  # We only subset for a non-named response.
  if (
    !is.null(.limit) &&
      len > .limit &&
      !"total_count" %in% names(res) &&
      length(res) == len
  ) {
    res_attr <- attributes(res)
    res <- res[seq_len(.limit)]
    attributes(res) <- res_attr
  }

  res
}

gh_response_length <- function(res) {
  if (
    !is.null(names(res)) && length(res) > 1 && names(res)[1] == "total_count"
  ) {
    # Ignore total_count, incomplete_results, repository_selection
    # and take the first list element to get the length
    lst <- vapply(res, is.list, logical(1))
    nm <- setdiff(
      names(res),
      c("total_count", "incomplete_results", "repository_selection")
    )
    tgt <- which(lst[nm])[1]
    if (is.na(tgt)) length(res) else length(res[[nm[tgt]]])
  } else {
    length(res)
  }
}
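
# Illustration (with a hypothetical response shape) of what
# gh_response_length() counts for a search-style response:
#
#   res <- list(
#     total_count = 5021,
#     incomplete_results = FALSE,
#     items = list(item1, item2, item3)   # <- placeholder elements
#   )
#
# Here names(res)[1] == "total_count", so the length is taken from the
# first remaining list element: length(res$items), i.e. 3, not length(res).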

gh_make_request <- function(x, error_call = caller_env()) {
  if (!x$method %in% c("GET", "POST", "PATCH", "PUT", "DELETE")) {
    cli::cli_abort("Unknown HTTP verb: {.val {x$method}}")
  }

  req <- httr2::request(x$url)
  req <- httr2::req_method(req, x$method)
  req <- httr2::req_url_query(req, !!!x$query)

  if (!is.null(x$body)) {
    if (is.raw(x$body)) {
      req <- httr2::req_body_raw(req, x$body)
    } else {
      req <- httr2::req_body_json(req, x$body, null = "list", digits = 4)
    }
  }
  req <- httr2::req_headers(req, !!!x$headers)

  # Reduce connection timeout from curl's 10s default to 5s
  req <- httr2::req_options(req, connecttimeout_ms = 5000)
  if (Sys.getenv("GH_FORCE_HTTP_1_1") == "true") {
    # curl value 2 is CURL_HTTP_VERSION_1_1, i.e. force HTTP/1.1
    req <- httr2::req_options(req, http_version = 2)
  }

  if (!isFALSE(getOption("gh_cache"))) {
    req <- httr2::req_cache(
      req,
      max_size = 100 * 1024 * 1024, # 100 MB
      path = tools::R_user_dir("gh", "cache")
    )
  }

  if (!is_testing()) {
    req <- httr2::req_retry(
      req,
      max_tries = 3,
      is_transient = function(resp) github_is_transient(resp, x$max_wait),
      after = github_after
    )
  }

  if (!is.null(x$max_rate)) {
    req <- httr2::req_throttle(req, x$max_rate)
  }

  # allow custom handling with gh_error
  req <- httr2::req_error(req, is_error = function(resp) FALSE)

  resp <- httr2::req_perform(req, path = x$desttmp)
  if (httr2::resp_status(resp) >= 400) {
    gh_error(resp, gh_req = x, error_call = error_call)
  }

  resp
}

# https://docs.github.com/v3/#client-errors
gh_error <- function(response, gh_req, error_call = caller_env()) {
  heads <- httr2::resp_headers(response)
  res <- httr2::resp_body_json(response)
  status <- httr2::resp_status(response)
  if (!is.null(gh_req$desttmp)) unlink(gh_req$desttmp)

  msg <- "GitHub API error ({status}): {heads$status %||% ''} {res$message}"

  if (status == 404) {
    msg <- c(msg, x = c("URL not found: {.url {response$url}}"))
  }

  doc_url <- res$documentation_url
  if (!is.null(doc_url)) {
    msg <- c(msg, c("i" = "Read more at {.url {doc_url}}"))
  }

  errors <- res$errors
  if (!is.null(errors)) {
    errors <- as.data.frame(do.call(rbind, errors))
    nms <- c("resource", "field", "code", "message")
    nms <- nms[nms %in% names(errors)]
    msg <- c(
      msg,
      capture.output(print(errors[nms], row.names = FALSE))
    )
  }

  cli::cli_abort(
    msg,
    class = c("github_error", paste0("http_error_", status)),
    call = error_call,
    response_headers = heads,
    response_content = res
  )
}


# use retry-after info when possible
# https://docs.github.com/en/rest/overview/resources-in-the-rest-api#exceeding-the-rate-limit
github_is_transient <- function(resp, max_wait) {
  if (httr2::resp_status(resp) != 403) {
    return(FALSE)
  }
  if (!identical(httr2::resp_header(resp, "x-ratelimit-remaining"), "0")) {
    return(FALSE)
  }

  time <- httr2::resp_header(resp, "x-ratelimit-reset")
  if (is.null(time)) {
    return(FALSE)
  }

  time <- as.numeric(time)
  seconds_to_wait <- time - unclass(Sys.time())
  seconds_to_wait <= max_wait
}
github_after <- function(resp) {
  time <- as.numeric(httr2::resp_header(resp, "x-ratelimit-reset"))
  time - unclass(Sys.time())
}
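
# Sketch of the arithmetic above, with a made-up header value: both
# `x-ratelimit-reset` and Sys.time() are in epoch seconds, so their
# difference is the number of seconds until the quota resets.
#
#   reset <- as.numeric("1700000300")  # hypothetical header value
#   now <- 1700000000                  # pretend unclass(Sys.time())
#   reset - now                        # 300 seconds; transient iff <= max_wait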


================================================
FILE: R/gh_gql.R
================================================
#' A simple interface for the GitHub GraphQL API v4.
#'
#' See more about the GraphQL API here:
#' <https://docs.github.com/graphql>
#'
#' Note: pagination and the `.limit` argument do not currently work,
#' as pagination in the GraphQL API is different from the v3 API.
#' If you need pagination with GraphQL, you'll need to do that manually.
#'
#' @inheritParams gh
#' @param query The GraphQL query, as a string.
#' @export
#' @seealso [gh()] for the GitHub v3 API.
#' @examplesIf FALSE
#' gh_gql("query { viewer { login }}")
#'
#' # Get rate limit
#' ratelimit_query <- "query {
#'   viewer {
#'     login
#'   }
#'   rateLimit {
#'     limit
#'     cost
#'     remaining
#'     resetAt
#'   }
#' }"
#'
#' gh_gql(ratelimit_query)
gh_gql <- function(query, ...) {
  if (".limit" %in% names(list(...))) {
    stop("`.limit` does not work with the GraphQL API")
  }

  gh(endpoint = "POST /graphql", query = query, ...)
}
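
# Since `.limit` is not supported here, pagination has to be done by hand
# with GraphQL connection cursors. A sketch (the query and the
# `res$data$...` paths below are illustrative and depend on your query):
#
#   cursor <- NULL
#   nodes <- list()
#   repeat {
#     after <- if (is.null(cursor)) "" else sprintf(', after: "%s"', cursor)
#     res <- gh_gql(sprintf(
#       "query { viewer { repositories(first: 100%s) {
#          nodes { name }
#          pageInfo { hasNextPage endCursor } } } }",
#       after
#     ))
#     page <- res$data$viewer$repositories
#     nodes <- c(nodes, page$nodes)
#     if (!isTRUE(page$pageInfo$hasNextPage)) break
#     cursor <- page$pageInfo$endCursor
#   }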


================================================
FILE: R/gh_rate_limit.R
================================================
#' Return GitHub user's current rate limits
#'
#' @description
#' `gh_rate_limits()` reports on all rate limits for the authenticated user.
#' `gh_rate_limit()` reports on rate limits for previous successful request.
#'
#' Further details on GitHub's API rate limit policies are available at
#' <https://docs.github.com/v3/#rate-limiting>.
#'
#' @param response `gh_response` object from a previous `gh` call; rate
#' limit values are determined from values in the response header.
#' Optional argument; if missing, a call to `GET /rate_limit` is made.
#'
#' @inheritParams gh
#'
#' @return A `list` containing the overall `limit`, the `remaining` quota,
#' and the limit `reset` time.
#'
#' @export

gh_rate_limit <- function(
  response = NULL,
  .token = NULL,
  .api_url = NULL,
  .send_headers = NULL
) {
  if (is.null(response)) {
    # This endpoint does not count against the rate limit
    .token <- .token %||% gh_token(.api_url)
    response <- gh(
      "GET /rate_limit",
      .token = .token,
      .api_url = .api_url,
      .send_headers = .send_headers
    )
  }

  stopifnot(inherits(response, "gh_response"))

  http_res <- attr(response, "response")

  reset <- as.integer(c(http_res[["x-ratelimit-reset"]], NA)[1])
  reset <- as.POSIXct(reset, origin = "1970-01-01")

  list(
    limit = as.integer(c(http_res[["x-ratelimit-limit"]], NA)[1]),
    remaining = as.integer(c(http_res[["x-ratelimit-remaining"]], NA)[1]),
    reset = reset
  )
}

#' @export
#' @rdname gh_rate_limit
gh_rate_limits <- function(
  .token = NULL,
  .api_url = NULL,
  .send_headers = NULL
) {
  .token <- .token %||% gh_token(.api_url)
  response <- gh(
    "GET /rate_limit",
    .token = .token,
    .api_url = .api_url,
    .send_headers = .send_headers
  )

  resources <- response$resources

  reset <- .POSIXct(sapply(resources, "[[", "reset"))

  data.frame(
    type = names(resources),
    limit = sapply(resources, "[[", "limit"),
    used = sapply(resources, "[[", "used"),
    remaining = sapply(resources, "[[", "remaining"),
    reset = reset,
    mins_left = round((unclass(reset) - unclass(Sys.time())) / 60, 1),
    stringsAsFactors = FALSE,
    row.names = NULL
  )
}


================================================
FILE: R/gh_request.R
================================================
## Main API URL
default_api_url <- function() {
  Sys.getenv("GITHUB_API_URL", unset = "https://api.github.com")
}

## Headers to send with each API request
default_send_headers <- c("User-Agent" = "https://github.com/r-lib/gh")

gh_build_request <- function(
  endpoint = "/user",
  params = list(),
  token = NULL,
  destfile = NULL,
  overwrite = NULL,
  accept = NULL,
  send_headers = NULL,
  max_wait = 10,
  max_rate = NULL,
  api_url = NULL,
  method = "GET"
) {
  working <- list(
    method = method,
    url = character(),
    headers = NULL,
    query = NULL,
    body = NULL,
    endpoint = endpoint,
    params = params,
    token = token,
    accept = c(Accept = accept),
    send_headers = send_headers,
    api_url = api_url,
    dest = destfile,
    overwrite = overwrite,
    max_wait = max_wait,
    max_rate = max_rate
  )

  working <- gh_set_verb(working)
  working <- gh_set_endpoint(working)
  working <- gh_set_query(working)
  working <- gh_set_body(working)
  working <- gh_set_url(working)
  working <- gh_set_headers(working)
  working <- gh_set_temp_destfile(working)
  working[c(
    "method",
    "url",
    "headers",
    "query",
    "body",
    "dest",
    "desttmp",
    "max_wait",
    "max_rate"
  )]
}


## gh_set_*(x)
## x = a list in which we build up an httr2 request
## x goes in, x comes out, possibly modified

gh_set_verb <- function(x) {
  if (!nzchar(x$endpoint)) {
    return(x)
  }

  # No method defined, so use default
  if (grepl("^/", x$endpoint) || grepl("^http", x$endpoint)) {
    return(x)
  }

  # Method can be lower-case (e.g. copy-pasting from API docs in Firefox)
  method <- gsub("^([^/ ]+)\\s+.*$", "\\1", x$endpoint)
  x$endpoint <- sub(sprintf("^%s\\s+", method), "", x$endpoint)
  # Now switch method to upper-case
  x$method <- toupper(method)
  x
}

gh_set_endpoint <- function(x) {
  params <- x$params
  if (
    !is_template(x$endpoint) || length(params) == 0L || has_no_names(params)
  ) {
    return(x)
  }

  named_params <- which(has_name(params))
  done <- rep_len(FALSE, length(params))
  endpoint <- endpoint2 <- x$endpoint

  for (i in named_params) {
    endpoint2 <- expand_variable(
      varname = names(params)[i],
      value = params[[i]][1],
      template = endpoint
    )
    if (is.na(endpoint2)) {
      cli::cli_abort(
        "Named NA parameters are not allowed: {names(params)[i]}"
      )
    }
    if (endpoint2 != endpoint) {
      endpoint <- endpoint2
      done[i] <- TRUE
    }
    if (!is_template(endpoint)) {
      break
    }
  }

  x$endpoint <- endpoint
  x$params <- x$params[!done]
  x$params <- cleanse_names(x$params)
  x
}

gh_set_query <- function(x) {
  params <- x$params
  if (x$method != "GET" || length(params) == 0L) {
    return(x)
  }
  stopifnot(all(has_name(params)))
  x$query <- params
  x$params <- NULL
  x
}

gh_set_body <- function(x) {
  if (length(x$params) == 0L) {
    return(x)
  }
  if (x$method == "GET") {
    warning("This is a 'GET' request and unnamed parameters are being ignored.")
    return(x)
  }
  if (length(x$params) == 1 && is.raw(x$params[[1]])) {
    x$body <- x$params[[1]]
  } else {
    x$body <- x$params
  }
  x
}

gh_set_url <- function(x) {
  if (grepl("^https?://", x$endpoint)) {
    x$url <- URLencode(x$endpoint)
    x$api_url <- get_baseurl(x$url)
  } else {
    x$api_url <- get_apiurl(x$api_url %||% default_api_url())
    x$url <- URLencode(paste0(x$api_url, x$endpoint))
  }

  x
}

gh_set_temp_destfile <- function(working) {
  working$desttmp <- if (is.null(working$dest)) {
    NULL
  } else {
    paste0(working$dest, "-", basename(tempfile("")), ".gh-tmp")
  }
  working
}

get_baseurl <- function(url) {
  # https://github.uni.edu/api/v3/
  if (!any(grepl("^https?://", url))) {
    stop("Only works with HTTP(S) protocols")
  }
  prot <- sub("^(https?://).*$", "\\1", url) # https://
  rest <- sub("^https?://(.*)$", "\\1", url) #         github.uni.edu/api/v3/
  host <- sub("/.*$", "", rest) #         github.uni.edu
  paste0(prot, host) # https://github.uni.edu
}

# https://api.github.com --> https://github.com
# api.github.com --> github.com
normalize_host <- function(x) {
  sub("api[.]github[.]com", "github.com", x)
}

get_hosturl <- function(url) {
  url <- get_baseurl(url)
  normalize_host(url)
}

# (almost) the inverse of get_hosturl()
# https://github.com     --> https://api.github.com
# https://github.uni.edu --> https://github.uni.edu/api/v3
get_apiurl <- function(url) {
  host_url <- get_hosturl(url)
  prot_host <- strsplit(host_url, "://", fixed = TRUE)[[1]]
  if (is_github_dot_com(host_url)) {
    paste0(prot_host[[1]], "://api.github.com")
  } else {
    paste0(host_url, "/api/v3")
  }
}

is_github_dot_com <- function(url) {
  url <- get_baseurl(url)
  url <- normalize_host(url)
  grepl("^https?://github\\.com$", url)
}
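
# How the URL helpers above fit together:
#
#   get_apiurl("https://github.com")       #> "https://api.github.com"
#   get_apiurl("https://github.uni.edu")   #> "https://github.uni.edu/api/v3"
#   get_hosturl("https://api.github.com")  #> "https://github.com"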

gh_set_headers <- function(x) {
  # x$api_url must be set properly at this point
  auth <- gh_auth(x$token %||% gh_token(x$api_url))
  send_headers <- gh_send_headers(x$accept, x$send_headers)
  x$headers <- c(send_headers, auth)
  x
}

gh_send_headers <- function(accept_header = NULL, headers = NULL) {
  modify_vector(
    modify_vector(default_send_headers, accept_header),
    headers
  )
}

# helpers ----
# https://tools.ietf.org/html/rfc6570
# we support what the RFC calls "Level 1 templates", which only require
# simple string expansion of a placeholder consisting of [A-Za-z0-9_]
is_template <- function(x) {
  is_colon_template(x) || is_uri_template(x)
}

is_colon_template <- function(x) grepl(":", x)

is_uri_template <- function(x) grepl("[{]\\w+?[}]", x)

template_type <- function(x) {
  if (is_uri_template(x)) {
    return("uri")
  }
  if (is_colon_template(x)) {
    return("colon")
  }
}

expand_variable <- function(varname, value, template) {
  type <- template_type(template)
  if (is.null(type)) {
    return(template)
  }
  pattern <- switch(
    type,
    uri = paste0("[{]", varname, "[}]"),
    colon = paste0(":", varname, "\\b"),
    stop("Internal error: unrecognized template type")
  )
  gsub(pattern, value, template)
}
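
# The two supported placeholder styles, expanded one variable at a time:
#
#   expand_variable("owner", "r-lib", "/repos/{owner}/{repo}")
#   #> "/repos/r-lib/{repo}"
#   expand_variable("owner", "r-lib", "/repos/:owner/:repo")
#   #> "/repos/r-lib/:repo"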


================================================
FILE: R/gh_response.R
================================================
gh_process_response <- function(resp, gh_req) {
  stopifnot(inherits(resp, "httr2_response"))

  content_type <- httr2::resp_content_type(resp)
  gh_media_type <- httr2::resp_header(resp, "x-github-media-type")

  is_raw <- identical(content_type, "application/octet-stream") ||
    isTRUE(grepl("param=raw$", gh_media_type, ignore.case = TRUE))
  is_ondisk <- inherits(resp$body, "httr2_path") && !is.null(gh_req$dest)
  is_empty <- length(resp$body) == 0

  if (is_ondisk) {
    res <- as.character(resp$body)
    file.rename(res, gh_req$dest)
    res <- gh_req$dest
  } else if (is_empty) {
    res <- list()
  } else if (grepl("^application/json", content_type, ignore.case = TRUE)) {
    res <- httr2::resp_body_json(resp)
  } else if (is_raw) {
    res <- httr2::resp_body_raw(resp)
  } else {
    if (grepl("^text/html", content_type, ignore.case = TRUE)) {
      warning("Response came back as html :(", call. = FALSE)
    }
    res <- list(message = httr2::resp_body_string(resp))
  }

  attr(res, "response") <- httr2::resp_headers(resp)
  attr(res, "request") <- remove_headers(gh_req)

  if (is_ondisk) {
    class(res) <- c("gh_response", "path")
  } else if (is_raw) {
    class(res) <- c("gh_response", "raw")
  } else {
    class(res) <- c("gh_response", "list")
  }
  res
}

remove_headers <- function(x) {
  x[names(x) != "headers"]
}

# Add vctrs methods that strip attributes from gh_response when combining,
# enabling rectangling via unnesting etc
# See <https://github.com/r-lib/gh/issues/161> for more details
#' @exportS3Method vctrs::vec_ptype2
vec_ptype2.gh_response.gh_response <- function(x, y, ...) {
  list()
}

#' @exportS3Method vctrs::vec_cast
vec_cast.list.gh_response <- function(x, to, ...) {
  attributes(x) <- NULL
  x
}


================================================
FILE: R/gh_token.R
================================================
#' Return the local user's GitHub Personal Access Token (PAT)
#'
#' @description
#' If gh can find a personal access token (PAT) via `gh_token()`, it includes
#' the PAT in its requests. Some requests succeed without a PAT, but many
#' require a PAT to prove the request is authorized by a specific GitHub user. A
#' PAT also helps with rate limiting. If your gh use is more than casual, you
#' want a PAT.
#'
#' gh calls [gitcreds::gitcreds_get()] with the `api_url`, which checks session
#' environment variables (`GITHUB_PAT`, `GITHUB_TOKEN`)
#' and then the local Git credential store for a PAT
#' appropriate to the `api_url`. Therefore, if you have previously used a PAT
#' with, e.g., command line Git, gh may retrieve and re-use it. You can call
#' [gitcreds::gitcreds_get()] directly, yourself, if you want to see what is
#' found for a specific URL. If no matching PAT is found,
#' [gitcreds::gitcreds_get()] errors, whereas `gh_token()` does not and,
#' instead, returns `""`.
#'
#' See GitHub's documentation on [Creating a personal access
#' token](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token),
#' or use `usethis::create_github_token()` for a guided experience, including
#' pre-selection of recommended scopes. Once you have a PAT, you can use
#' [gitcreds::gitcreds_set()] to add it to the Git credential store. From that
#' point on, gh (via [gitcreds::gitcreds_get()]) should be able to find it
#' without further effort on your part.
#'
#' @param api_url GitHub API URL. Defaults to the `GITHUB_API_URL` environment
#'   variable, if set, and otherwise to <https://api.github.com>.
#'
#' @return A string, if a PAT is found, or the empty
#'   string otherwise. For convenience, the return value has an S3 class in
#'   order to ensure that simple printing strategies don't reveal the entire
#'   PAT.
#'
#' @export
#'
#' @examples
#' \dontrun{
#' gh_token()
#'
#' format(gh_token())
#'
#' str(gh_token())
#' }
gh_token <- function(api_url = NULL) {
  api_url <- api_url %||% default_api_url()
  stopifnot(is.character(api_url), length(api_url) == 1)
  host_url <- get_hosturl(api_url)
  # Check for credentials supplied by Posit Connect.
  if (is_installed("connectcreds")) {
    if (connectcreds::has_viewer_token(host_url)) {
      token <- connectcreds::connect_viewer_token(host_url)
      return(gh_pat(token$access_token))
    }
  }
  token <- tryCatch(
    gitcreds::gitcreds_get(host_url),
    error = function(e) NULL
  )
  gh_pat(token$password %||% "")
}

#' @export
#' @rdname gh_token
gh_token_exists <- function(api_url = NULL) {
  tryCatch(nzchar(gh_token(api_url)), error = function(e) FALSE)
}

gh_auth <- function(token) {
  if (isTRUE(token != "")) {
    if (any(grepl("\\s", token))) {
      warning("Token contains whitespace characters")
    }
    c("Authorization" = paste("token", trim_ws(token)))
  } else {
    character()
  }
}

# gh_pat class: exists in order to have a print method that hides info ----
new_gh_pat <- function(x) {
  if (is.character(x) && length(x) == 1) {
    structure(x, class = "gh_pat")
  } else {
    cli::cli_abort("A GitHub PAT must be a string")
  }
}

# validates PAT only in a very narrow, technical, and local sense
validate_gh_pat <- function(x) {
  stopifnot(inherits(x, "gh_pat"))
  if (
    x == "" ||
      # https://github.blog/changelog/2021-03-04-authentication-token-format-updates/
      # Fine grained tokens start with "github_pat_".
      # https://github.blog/changelog/2022-10-18-introducing-fine-grained-personal-access-tokens/
      grepl(
        "^(gh[pousr]_[A-Za-z0-9_]{36,251}|github_pat_[A-Za-z0-9_]{36,244})$",
        x
      ) ||
      grepl("^[[:xdigit:]]{40}$", x)
  ) {
    x
  } else {
    url <- "https://gh.r-lib.org/articles/managing-personal-access-tokens.html"
    cli::cli_abort(c(
      "Invalid GitHub PAT format",
      "i" = "A GitHub PAT must have one of three forms:",
      "*" = "40 hexadecimal digits (older PATs)",
      "*" = "A 'ghp_' (or similar 'gh*_') prefix followed by 36 to 251 more characters (newer PATs)",
      "*" = "A 'github_pat_' prefix followed by 36 to 244 more characters (fine-grained PATs)",
      "i" = "Read more at {.url {url}}."
    ))
  }
}

gh_pat <- function(x) {
  validate_gh_pat(new_gh_pat(x))
}

#' @export
format.gh_pat <- function(x, ...) {
  if (x == "") {
    "<no PAT>"
  } else {
    obfuscate(x)
  }
}

#' @export
print.gh_pat <- function(x, ...) {
  cat(format(x), sep = "\n")
  invisible(x)
}

#' @export
str.gh_pat <- function(object, ...) {
  cat(paste0("<gh_pat> ", format(object), "\n", collapse = ""))
  invisible()
}

obfuscate <- function(x, first = 4, last = 4) {
  paste0(
    substr(x, start = 1, stop = first),
    "...",
    substr(x, start = nchar(x) - last + 1, stop = nchar(x))
  )
}
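
# For example (token value made up), with the default first = last = 4:
#
#   obfuscate("ghp_0123456789abcdefghij")  #> "ghp_...ghij"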


================================================
FILE: R/gh_whoami.R
================================================
#' Info on current GitHub user and token
#'
#' Reports the name, GitHub login, and GitHub URL for the currently
#' authenticated user, the first bit of the token, and the associated scopes.
#'
#' Get a personal access token for the GitHub API from
#' <https://github.com/settings/tokens> and select the scopes necessary for your
#' planned tasks. The `repo` scope, for example, is one many are likely to need.
#'
#' On macOS and Windows it is best to store the token in the git credential
#' store, where most GitHub clients, including gh, can access it. You can
#' use the gitcreds package to add your token to the credential store:
#'
#' ```r
#' gitcreds::gitcreds_set()
#' ```
#'
#' See <https://gh.r-lib.org/articles/managing-personal-access-tokens.html>
#' and <https://usethis.r-lib.org/articles/articles/git-credentials.html>
#' for more about managing GitHub (and generic git) credentials.
#'
#' On other systems, including Linux, the git credential store is
#' typically not as convenient, and you might want to store your token in
#' the `GITHUB_PAT` environment variable, which you can set in your
#' `.Renviron` file.
#'
#' @inheritParams gh
#'
#' @return A `gh_response` object, which is also a `list`.
#' @export
#'
#' @examplesIf identical(Sys.getenv("IN_PKGDOWN"), "true")
#' gh_whoami()
#' @examplesIf FALSE
#' ## explicit token + use with GitHub Enterprise
#' gh_whoami(
#'   .token = "8c70fd8419398999c9ac5bacf3192882193cadf2",
#'   .api_url = "https://github.foobar.edu/api/v3"
#' )
gh_whoami <- function(.token = NULL, .api_url = NULL, .send_headers = NULL) {
  .token <- .token %||% gh_token(.api_url)
  if (isTRUE(.token == "")) {
    message(
      "No personal access token (PAT) available.\n",
      "Obtain a PAT from here:\n",
      "https://github.com/settings/tokens\n",
      "For more on what to do with the PAT, see ?gh_whoami."
    )
    return(invisible(NULL))
  }
  res <- gh(
    endpoint = "/user",
    .token = .token,
    .api_url = .api_url,
    .send_headers = .send_headers
  )
  scopes <- attr(res, "response")[["x-oauth-scopes"]]
  res <- res[c("name", "login", "html_url")]
  res$scopes <- scopes
  res$token <- format(gh_pat(.token))
  ## 'gh_response' class has to be restored
  class(res) <- c("gh_response", "list")
  res
}


================================================
FILE: R/git.R
================================================
#' Find the GitHub remote associated with a path
#'
#' This is a handy helper if you want to make gh requests related to the
#' current project.
#'
#' @param path Path that is contained within a git repo.
#' @return If the repo has a GitHub remote, a list containing `username`
#'    and `repo`. Otherwise an error is thrown.
#' @export
#' @examplesIf interactive()
#' gh_tree_remote()
gh_tree_remote <- function(path = ".") {
  github_remote(git_remotes(path), path)
}

github_remote <- function(x, path) {
  remotes <- lapply(x, github_remote_parse)
  remotes <- remotes[!vapply(remotes, is.null, logical(1))]

  if (length(remotes) == 0) {
    cli::cli_abort("No GitHub remotes found at {.path {path}}")
  }

  if (length(remotes) > 1) {
    if (any(names(remotes) == "origin")) {
      warning("Multiple github remotes found. Using origin.", call. = FALSE)
      remotes[["origin"]]
    } else {
      warning("Multiple github remotes found. Using first.", call. = FALSE)
      remotes[[1]]
    }
  } else {
    remotes[[1]]
  }
}

github_remote_parse <- function(x) {
  if (length(x) == 0) {
    return(NULL)
  }
  if (!grepl("github", x)) {
    return(NULL)
  }

  # https://github.com/hadley/devtools.git
  # https://github.com/hadley/devtools
  # git@github.com:hadley/devtools.git
  re <- "github[^/:]*[/:]([^/]+)/(.*?)(?:\\.git)?$"
  m <- regexec(re, x)
  match <- regmatches(x, m)[[1]]

  if (length(match) == 0) {
    return(NULL)
  }

  list(
    username = match[2],
    repo = match[3]
  )
}
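
# All three URL shapes in the comment above parse to the same result:
#
#   github_remote_parse("git@github.com:hadley/devtools.git")
#   #> list(username = "hadley", repo = "devtools")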

git_remotes <- function(path = ".") {
  conf <- git_config(path)
  remotes <- conf[grepl("^remote", names(conf))]

  remotes <- discard(remotes, function(x) is.null(x$url))
  urls <- vapply(remotes, "[[", "url", FUN.VALUE = character(1))

  names(urls) <- gsub('^remote "(.*?)"$', "\\1", names(remotes))
  urls
}


git_config <- function(path = ".") {
  config_path <- file.path(repo_root(path), ".git", "config")
  if (!file.exists(config_path)) {
    cli::cli_abort("git config does not exist at {.path {config_path}}")
  }
  ini::read.ini(config_path, "UTF-8")
}

repo_root <- function(path = ".") {
  if (!file.exists(path)) {
    cli::cli_abort("Can't find repo at {.path {path}}")
  }

  # Walk up to root directory
  while (!has_git(path)) {
    if (is_root(path)) {
      cli::cli_abort("Could not find git root from {.path {path}}.")
    }

    path <- dirname(path)
  }

  path
}

has_git <- function(path) {
  file.exists(file.path(path, ".git"))
}

is_root <- function(path) {
  identical(path, dirname(path))
}


================================================
FILE: R/import-standalone-purrr.R
================================================
# Standalone file: do not edit by hand
# Source: <https://github.com/r-lib/rlang/blob/main/R/standalone-purrr.R>
# ----------------------------------------------------------------------
#
# ---
# repo: r-lib/rlang
# file: standalone-purrr.R
# last-updated: 2023-02-23
# license: https://unlicense.org
# imports: rlang
# ---
#
# This file provides a minimal shim to provide a purrr-like API on top of
# base R functions. They are not drop-in replacements but allow a similar style
# of programming.
#
# ## Changelog
#
# 2023-02-23:
# * Added `list_c()`
#
# 2022-06-07:
# * `transpose()` is now more consistent with purrr when inner names
#   are not congruent (#1346).
#
# 2021-12-15:
# * `transpose()` now supports empty lists.
#
# 2021-05-21:
# * Fixed "object `x` not found" error in `imap()` (@mgirlich)
#
# 2020-04-14:
# * Removed `pluck*()` functions
# * Removed `*_cpl()` functions
# * Used `as_function()` to allow use of `~`
# * Used `.` prefix for helpers
#
# nocov start

map <- function(.x, .f, ...) {
  .f <- as_function(.f, env = global_env())
  lapply(.x, .f, ...)
}
walk <- function(.x, .f, ...) {
  map(.x, .f, ...)
  invisible(.x)
}

map_lgl <- function(.x, .f, ...) {
  .rlang_purrr_map_mold(.x, .f, logical(1), ...)
}
map_int <- function(.x, .f, ...) {
  .rlang_purrr_map_mold(.x, .f, integer(1), ...)
}
map_dbl <- function(.x, .f, ...) {
  .rlang_purrr_map_mold(.x, .f, double(1), ...)
}
map_chr <- function(.x, .f, ...) {
  .rlang_purrr_map_mold(.x, .f, character(1), ...)
}
.rlang_purrr_map_mold <- function(.x, .f, .mold, ...) {
  .f <- as_function(.f, env = global_env())
  out <- vapply(.x, .f, .mold, ..., USE.NAMES = FALSE)
  names(out) <- names(.x)
  out
}

map2 <- function(.x, .y, .f, ...) {
  .f <- as_function(.f, env = global_env())
  out <- mapply(.f, .x, .y, MoreArgs = list(...), SIMPLIFY = FALSE)
  if (length(out) == length(.x)) {
    set_names(out, names(.x))
  } else {
    set_names(out, NULL)
  }
}
map2_lgl <- function(.x, .y, .f, ...) {
  as.vector(map2(.x, .y, .f, ...), "logical")
}
map2_int <- function(.x, .y, .f, ...) {
  as.vector(map2(.x, .y, .f, ...), "integer")
}
map2_dbl <- function(.x, .y, .f, ...) {
  as.vector(map2(.x, .y, .f, ...), "double")
}
map2_chr <- function(.x, .y, .f, ...) {
  as.vector(map2(.x, .y, .f, ...), "character")
}
imap <- function(.x, .f, ...) {
  map2(.x, names(.x) %||% seq_along(.x), .f, ...)
}

pmap <- function(.l, .f, ...) {
  .f <- as.function(.f)
  args <- .rlang_purrr_args_recycle(.l)
  do.call("mapply", c(
    FUN = list(quote(.f)),
    args, MoreArgs = quote(list(...)),
    SIMPLIFY = FALSE, USE.NAMES = FALSE
  ))
}
.rlang_purrr_args_recycle <- function(args) {
  lengths <- map_int(args, length)
  n <- max(lengths)

  stopifnot(all(lengths == 1L | lengths == n))
  to_recycle <- lengths == 1L
  args[to_recycle] <- map(args[to_recycle], function(x) rep.int(x, n))

  args
}

keep <- function(.x, .f, ...) {
  .x[.rlang_purrr_probe(.x, .f, ...)]
}
discard <- function(.x, .p, ...) {
  sel <- .rlang_purrr_probe(.x, .p, ...)
  .x[is.na(sel) | !sel]
}
map_if <- function(.x, .p, .f, ...) {
  matches <- .rlang_purrr_probe(.x, .p)
  .x[matches] <- map(.x[matches], .f, ...)
  .x
}
.rlang_purrr_probe <- function(.x, .p, ...) {
  if (is_logical(.p)) {
    stopifnot(length(.p) == length(.x))
    .p
  } else {
    .p <- as_function(.p, env = global_env())
    map_lgl(.x, .p, ...)
  }
}

compact <- function(.x) {
  Filter(length, .x)
}

transpose <- function(.l) {
  if (!length(.l)) {
    return(.l)
  }

  inner_names <- names(.l[[1]])

  if (is.null(inner_names)) {
    fields <- seq_along(.l[[1]])
  } else {
    fields <- set_names(inner_names)
    .l <- map(.l, function(x) {
      if (is.null(names(x))) {
        set_names(x, inner_names)
      } else {
        x
      }
    })
  }

  # This way missing fields are subsetted as `NULL` instead of causing
  # an error
  .l <- map(.l, as.list)

  map(fields, function(i) {
    map(.l, .subset2, i)
  })
}

every <- function(.x, .p, ...) {
  .p <- as_function(.p, env = global_env())

  for (i in seq_along(.x)) {
    if (!rlang::is_true(.p(.x[[i]], ...))) return(FALSE)
  }
  TRUE
}
some <- function(.x, .p, ...) {
  .p <- as_function(.p, env = global_env())

  for (i in seq_along(.x)) {
    if (rlang::is_true(.p(.x[[i]], ...))) return(TRUE)
  }
  FALSE
}
negate <- function(.p) {
  .p <- as_function(.p, env = global_env())
  function(...) !.p(...)
}

reduce <- function(.x, .f, ..., .init) {
  f <- function(x, y) .f(x, y, ...)
  Reduce(f, .x, init = .init)
}
reduce_right <- function(.x, .f, ..., .init) {
  f <- function(x, y) .f(y, x, ...)
  Reduce(f, .x, init = .init, right = TRUE)
}
accumulate <- function(.x, .f, ..., .init) {
  f <- function(x, y) .f(x, y, ...)
  Reduce(f, .x, init = .init, accumulate = TRUE)
}
accumulate_right <- function(.x, .f, ..., .init) {
  f <- function(x, y) .f(y, x, ...)
  Reduce(f, .x, init = .init, right = TRUE, accumulate = TRUE)
}

detect <- function(.x, .f, ..., .right = FALSE, .p = is_true) {
  .p <- as_function(.p, env = global_env())
  .f <- as_function(.f, env = global_env())

  for (i in .rlang_purrr_index(.x, .right)) {
    if (.p(.f(.x[[i]], ...))) {
      return(.x[[i]])
    }
  }
  NULL
}
detect_index <- function(.x, .f, ..., .right = FALSE, .p = is_true) {
  .p <- as_function(.p, env = global_env())
  .f <- as_function(.f, env = global_env())

  for (i in .rlang_purrr_index(.x, .right)) {
    if (.p(.f(.x[[i]], ...))) {
      return(i)
    }
  }
  0L
}
.rlang_purrr_index <- function(x, right = FALSE) {
  idx <- seq_along(x)
  if (right) {
    idx <- rev(idx)
  }
  idx
}

list_c <- function(x) {
  inject(c(!!!x))
}

# nocov end


================================================
FILE: R/pagination.R
================================================
extract_link <- function(gh_response, link) {
  headers <- attr(gh_response, "response")
  links <- headers$link
  if (is.null(links)) {
    return(NA_character_)
  }
  links <- trim_ws(strsplit(links, ",")[[1]])
  link_list <- lapply(links, function(x) {
    x <- trim_ws(strsplit(x, ";")[[1]])
    name <- sub("^.*\"(.*)\".*$", "\\1", x[2])
    value <- sub("^<(.*)>$", "\\1", x[1])
    c(name, value)
  })
  link_list <- structure(
    vapply(link_list, "[", "", 2),
    names = vapply(link_list, "[", "", 1)
  )

  if (link %in% names(link_list)) {
    link_list[[link]]
  } else {
    NA_character_
  }
}

gh_has <- function(gh_response, link) {
  url <- extract_link(gh_response, link)
  !is.na(url)
}

gh_has_next <- function(gh_response) {
  gh_has(gh_response, "next")
}

gh_link_request <- function(gh_response, link, .token, .send_headers) {
  stopifnot(inherits(gh_response, "gh_response"))

  url <- extract_link(gh_response, link)
  if (is.na(url)) cli::cli_abort("No {link} page")

  req <- attr(gh_response, "request")
  req$url <- url
  req$token <- .token
  req$send_headers <- .send_headers
  req <- gh_set_headers(req)
  req
}

gh_link <- function(gh_response, link, .token, .send_headers) {
  req <- gh_link_request(gh_response, link, .token, .send_headers)
  raw <- gh_make_request(req)
  gh_process_response(raw, req)
}

gh_extract_pages <- function(gh_response) {
  last <- extract_link(gh_response, "last")
  if (!is.na(last)) {
    as.integer(httr2::url_parse(last)$query$page)
  } else {
    NA_integer_
  }
}

#' Get the next, previous, first or last page of results
#'
#' @details
#' Note that these are not always defined. E.g. if the first
#' page was queried (the default), then the first and previous
#' pages are not defined. Similarly, if the current page is the
#' last one, then there is no next page defined, etc.
#'
#' If the requested page does not exist, an error is thrown.
#'
#' @param gh_response An object returned by a [gh()] call.
#' @inheritParams gh
#' @return Answer from the API.
#'
#' @seealso The `.limit` argument to [gh()] supports fetching more than
#'   one page.
#'
#' @name gh_next
#' @export
#' @examplesIf identical(Sys.getenv("IN_PKGDOWN"), "true")
#' x <- gh("/users")
#' vapply(x, "[[", character(1), "login")
#' x2 <- gh_next(x)
#' vapply(x2, "[[", character(1), "login")
gh_next <- function(gh_response, .token = NULL, .send_headers = NULL) {
  gh_link(gh_response, "next", .token = .token, .send_headers = .send_headers)
}

#' @name gh_next
#' @export

gh_prev <- function(gh_response, .token = NULL, .send_headers = NULL) {
  gh_link(gh_response, "prev", .token = .token, .send_headers = .send_headers)
}

#' @name gh_next
#' @export

gh_first <- function(gh_response, .token = NULL, .send_headers = NULL) {
  gh_link(gh_response, "first", .token = .token, .send_headers = .send_headers)
}

#' @name gh_next
#' @export

gh_last <- function(gh_response, .token = NULL, .send_headers = NULL) {
  gh_link(gh_response, "last", .token = .token, .send_headers = .send_headers)
}


================================================
FILE: R/print.R
================================================
#' Print the result of a GitHub API call
#'
#' @param x The result object.
#' @param ... Ignored.
#' @return The JSON result.
#'
#' @importFrom jsonlite prettify toJSON
#' @export
#' @method print gh_response

print.gh_response <- function(x, ...) {
  if (inherits(x, c("raw", "path"))) {
    attributes(x) <- list(class = class(x))
    print.default(x)
  } else {
    print(toJSON(unclass(x), pretty = TRUE, auto_unbox = TRUE, force = TRUE))
  }
}


================================================
FILE: R/utils.R
================================================
trim_ws <- function(x) {
  sub("\\s*$", "", sub("^\\s*", "", x))
}

## from devtools, among other places
compact <- function(x) {
  is_empty <- vapply(x, function(x) length(x) == 0, logical(1))
  x[!is_empty]
}

## from purrr, among other places
`%||%` <- function(x, y) {
  if (is.null(x)) {
    y
  } else {
    x
  }
}

## as seen in purrr, with the name `has_names()`
has_name <- function(x) {
  nms <- names(x)
  if (is.null(nms)) {
    rep_len(FALSE, length(x))
  } else {
    !(is.na(nms) | nms == "")
  }
}

has_no_names <- function(x) all(!has_name(x))

## if all names are "", strip completely
cleanse_names <- function(x) {
  if (has_no_names(x)) {
    names(x) <- NULL
  }
  x
}

## to process HTTP headers, i.e. combine defaults w/ user-specified headers
## in the spirit of modifyList(), except
## x and y are vectors (not lists)
## name comparison is case insensitive
## http://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.2
## x will be default headers, y will be user-specified
modify_vector <- function(x, y = NULL) {
  if (length(y) == 0L) {
    return(x)
  }
  lnames <- function(x) tolower(names(x))
  c(x[!(lnames(x) %in% lnames(y))], y)
}


discard <- function(.x, .p, ...) {
  sel <- probe(.x, .p, ...)
  .x[is.na(sel) | !sel]
}
probe <- function(.x, .p, ...) {
  if (is.logical(.p)) {
    stopifnot(length(.p) == length(.x))
    .p
  } else {
    vapply(.x, .p, logical(1), ...)
  }
}

drop_named_nulls <- function(x) {
  if (has_no_names(x)) {
    return(x)
  }
  named <- has_name(x)
  null <- vapply(x, is.null, logical(1))
  cleanse_names(x[!named | !null])
}

.parse_params <- function(..., .params = list()) {
  params <- c(list2(...), .params)
  drop_named_nulls(params)
}

check_named_nas <- function(x) {
  if (has_no_names(x)) {
    return(x)
  }
  named <- has_name(x)
  na <- vapply(x, FUN.VALUE = logical(1), function(v) {
    is.atomic(v) && anyNA(v)
  })
  bad <- which(named & na)
  if (length(bad)) {
    str <- paste0("`", names(x)[bad], "`", collapse = ", ")
    stop("Named NA parameters are not allowed: ", str)
  }
}

can_load <- function(pkg) {
  isTRUE(requireNamespace(pkg, quietly = TRUE))
}

is_interactive <- function() {
  opt <- getOption("rlib_interactive")
  if (isTRUE(opt)) {
    TRUE
  } else if (identical(opt, FALSE)) {
    FALSE
  } else if (tolower(getOption("knitr.in.progress", "false")) == "true") {
    FALSE
  } else if (identical(Sys.getenv("TESTTHAT"), "true")) {
    FALSE
  } else {
    interactive()
  }
}

is_testing <- function() {
  identical(Sys.getenv("TESTTHAT"), "true")
}


================================================
FILE: README.Rmd
================================================
---
output: github_document
---

<!-- README.md is generated from README.Rmd. Please edit that file -->

```{r}
#| label: setup
#| include: false
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>",
  fig.path = "man/figures/README-",
  out.width = "100%"
)
```

# gh

<!-- badges: start -->
[![R-CMD-check](https://github.com/r-lib/gh/workflows/R-CMD-check/badge.svg)](https://github.com/r-lib/gh/actions)
[![](https://www.r-pkg.org/badges/version/gh)](https://www.r-pkg.org/pkg/gh)
[![CRAN Posit mirror downloads](https://cranlogs.r-pkg.org/badges/gh)](https://www.r-pkg.org/pkg/gh)
[![R-CMD-check](https://github.com/r-lib/gh/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/r-lib/gh/actions/workflows/R-CMD-check.yaml)
[![Codecov test coverage](https://codecov.io/gh/r-lib/gh/graph/badge.svg)](https://app.codecov.io/gh/r-lib/gh)
<!-- badges: end -->

Minimalistic client to access GitHub's
[REST](https://docs.github.com/rest) and [GraphQL](https://docs.github.com/graphql) APIs.

## Installation and setup

Install the package from CRAN as usual:

```{r}
#| eval: false
install.packages("gh")
```

Install the development version from GitHub:

```{r}
#| eval: false
pak::pak("r-lib/gh")
```

### Authentication

The value returned by `gh::gh_token()` is used as the Personal Access Token
(PAT). A token is needed for some requests, and it helps with rate limiting.
gh can use your regular git credentials in the git credential store, via
the gitcreds package. Use `gitcreds::gitcreds_set()` to put a PAT into the
git credential store. If you cannot use the credential store, set the
`GITHUB_PAT` environment variable to your PAT. See the details in the
`?gh::gh_token` manual page and the manual of the gitcreds package.
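
For example, you can store a PAT and then check what gh will use. This is a
sketch (`gitcreds_set()` prompts interactively, so it is not evaluated here):

```{r}
#| eval: false
# put a PAT into the git credential store (prompts for the token)
gitcreds::gitcreds_set()

# show the user, scopes and (masked) token gh will authenticate with
gh::gh_whoami()
```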

### API URL

* The `GITHUB_API_URL` environment variable, if set, is used as the default GitHub API URL.
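
For example, to point gh at a GitHub Enterprise instance (the URL below is a
made-up placeholder):

```{r}
#| eval: false
Sys.setenv(GITHUB_API_URL = "https://github.example.com/api/v3")
```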

## Usage

```{r}
library(gh)
```

Use the `gh()` function to access all API endpoints. The endpoints are
listed in the [documentation](https://docs.github.com/rest).

The first argument of `gh()` is the endpoint. You can just copy and paste the
API endpoints from the documentation. Note that the leading slash
must be included as well.

From <https://docs.github.com/rest/reference/repos#list-repositories-for-a-user> you can copy and paste `GET /users/{username}/repos` into your `gh()`
call. E.g.

```{r}
my_repos <- gh("GET /users/{username}/repos", username = "gaborcsardi")
vapply(my_repos, "[[", "", "name")
```

The JSON result sent by the API is converted to an R object.

Parameters can be passed as extra arguments. E.g.

```{r}
my_repos <- gh(
  "/users/{username}/repos",
  username = "gaborcsardi",
  sort = "created")
vapply(my_repos, "[[", "", "name")
```

### POST, PATCH, PUT and DELETE requests

POST, PATCH, PUT, and DELETE requests can be sent by including the
HTTP verb before the endpoint, in the first argument. E.g. to
create a repository:

```{r}
#| eval: false
new_repo <- gh("POST /user/repos", name = "my-new-repo-for-gh-testing")
```

and then delete it:

```{r}
#| eval: false
gh("DELETE /repos/{owner}/{repo}", owner = "gaborcsardi",
   repo = "my-new-repo-for-gh-testing")
```

### Tokens

By default the token returned by `gh::gh_token()` is used (see the
Authentication section above). Alternatively, one can set the `.token`
argument of `gh()`.

### Pagination

Supply the `page` parameter to get subsequent pages:

```{r}
my_repos2 <- gh("GET /orgs/{org}/repos", org = "r-lib", page = 2)
vapply(my_repos2, "[[", "", "name")
```
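
You can also walk pages with `gh_next()`, or let `.limit` paginate for you.
A sketch, not evaluated here:

```{r}
#| eval: false
page1 <- gh("GET /orgs/{org}/repos", org = "r-lib")
page2 <- gh_next(page1)

# or fetch up to 150 records, issuing as many requests as needed
repos <- gh("GET /orgs/{org}/repos", org = "r-lib", .limit = 150)
```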

## Environment Variables

* The `GITHUB_API_URL` environment variable is used as the default GitHub
  API URL.
* The `GITHUB_PAT` and `GITHUB_TOKEN` environment variables are used, if
  set, in this order, as the default token. Consider using the git credential
  store instead, see `?gh::gh_token`.

## Code of Conduct

Please note that the gh project is released with a
[Contributor Code of Conduct](https://gh.r-lib.org/CODE_OF_CONDUCT.html).
By contributing to this project, you agree to abide by its terms.

## License

MIT © Gábor Csárdi, Jennifer Bryan, Hadley Wickham


================================================
FILE: README.md
================================================

<!-- README.md is generated from README.Rmd. Please edit that file -->

# gh

<!-- badges: start -->

[![R-CMD-check](https://github.com/r-lib/gh/workflows/R-CMD-check/badge.svg)](https://github.com/r-lib/gh/actions)
[![](https://www.r-pkg.org/badges/version/gh)](https://www.r-pkg.org/pkg/gh)
[![CRAN Posit mirror
downloads](https://cranlogs.r-pkg.org/badges/gh)](https://www.r-pkg.org/pkg/gh)
[![R-CMD-check](https://github.com/r-lib/gh/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/r-lib/gh/actions/workflows/R-CMD-check.yaml)
[![Codecov test
coverage](https://codecov.io/gh/r-lib/gh/graph/badge.svg)](https://app.codecov.io/gh/r-lib/gh)
<!-- badges: end -->

Minimalistic client to access GitHub’s
[REST](https://docs.github.com/rest) and
[GraphQL](https://docs.github.com/graphql) APIs.

## Installation and setup

Install the package from CRAN as usual:

``` r
install.packages("gh")
```

Install the development version from GitHub:

``` r
pak::pak("r-lib/gh")
```

### Authentication

The value returned by `gh::gh_token()` is used as the Personal Access Token
(PAT). A token is needed for some requests, and it helps with rate
limiting. gh can use your regular git credentials in the git credential
store, via the gitcreds package. Use `gitcreds::gitcreds_set()` to put a
PAT into the git credential store. If you cannot use the credential
store, set the `GITHUB_PAT` environment variable to your PAT. See the
details in the `?gh::gh_token` manual page and the manual of the
gitcreds package.
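
For example, you can store a PAT and then check what gh will use. This is
a sketch (`gitcreds_set()` prompts interactively, so it is not run here):

``` r
# put a PAT into the git credential store (prompts for the token)
gitcreds::gitcreds_set()

# show the user, scopes and (masked) token gh will authenticate with
gh::gh_whoami()
```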

### API URL

-   The `GITHUB_API_URL` environment variable, if set, is used as the
    default GitHub API URL.
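
For example, to point gh at a GitHub Enterprise instance (the URL below
is a made-up placeholder):

``` r
Sys.setenv(GITHUB_API_URL = "https://github.example.com/api/v3")
```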

## Usage

``` r
library(gh)
```

Use the `gh()` function to access all API endpoints. The endpoints are
listed in the [documentation](https://docs.github.com/rest).

The first argument of `gh()` is the endpoint. You can just copy and
paste the API endpoints from the documentation. Note that the leading
slash must be included as well.

From
<https://docs.github.com/rest/reference/repos#list-repositories-for-a-user>
you can copy and paste `GET /users/{username}/repos` into your `gh()`
call. E.g.

``` r
my_repos <- gh("GET /users/{username}/repos", username = "gaborcsardi")
vapply(my_repos, "[[", "", "name")
#>  [1] "after"                "alda"                 "alexr"               
#>  [4] "all.primer.tutorials" "altlist"              "anticlust"           
#>  [7] "argufy"               "ask"                  "async"               
#> [10] "autobrew-bundler"     "available-work"       "baguette"            
#> [13] "BCEA"                 "BH"                   "bigrquerystorage"    
#> [16] "brew-big-sur"         "brokenPackage"        "brulee"              
#> [19] "build-r-app"          "butcher"              "censored"            
#> [22] "cf-tunnel"            "checkinstall"         "cli"                 
#> [25] "clock"                "comments"             "covr"                
#> [28] "covrlabs"             "cran-metadata"        "csg"
```

The JSON result sent by the API is converted to an R object.

Parameters can be passed as extra arguments. E.g.

``` r
my_repos <- gh(
  "/users/{username}/repos",
  username = "gaborcsardi",
  sort = "created")
vapply(my_repos, "[[", "", "name")
#>  [1] "phantomjs"       "FSA"             "greta"           "webdriver"      
#>  [5] "clock"           "testthat"        "jsonlite"        "duckdb"         
#>  [9] "duckdb-r"        "httpuv"          "unwind"          "httr2"          
#> [13] "pins-r"          "install-figlet"  "weird-package"   "anticlust"      
#> [17] "nanoparquet-cli" "cf-tunnel"       "myweek"          "figlet"         
#> [21] "evercran"        "available-work"  "r-shell"         "Rcpp"           
#> [25] "openssl"         "openbsd-vm"      "cran-metadata"   "run-r-app"      
#> [29] "build-r-app"     "comments"
```

### POST, PATCH, PUT and DELETE requests

POST, PATCH, PUT, and DELETE requests can be sent by including the HTTP
verb before the endpoint, in the first argument. E.g. to create a
repository:

``` r
new_repo <- gh("POST /user/repos", name = "my-new-repo-for-gh-testing")
```

and then delete it:

``` r
gh("DELETE /repos/{owner}/{repo}", owner = "gaborcsardi",
   repo = "my-new-repo-for-gh-testing")
```

### Tokens

By default the token returned by `gh::gh_token()` is used (see the
Authentication section above). Alternatively, one can set the `.token`
argument of `gh()`.

### Pagination

Supply the `page` parameter to get subsequent pages:

``` r
my_repos2 <- gh("GET /orgs/{org}/repos", org = "r-lib", page = 2)
vapply(my_repos2, "[[", "", "name")
#>  [1] "desc"        "profvis"     "sodium"      "gargle"      "remotes"    
#>  [6] "jose"        "backports"   "rcmdcheck"   "vdiffr"      "callr"      
#> [11] "mockery"     "here"        "revdepcheck" "processx"    "vctrs"      
#> [16] "debugme"     "usethis"     "rlang"       "pkgload"     "httrmock"   
#> [21] "pkgbuild"    "prettycode"  "roxygen2md"  "pkgapi"      "zeallot"    
#> [26] "liteq"       "keyring"     "sloop"       "styler"      "ansistrings"
```

## Environment Variables

-   The `GITHUB_API_URL` environment variable is used as the default
    GitHub API URL.
-   The `GITHUB_PAT` and `GITHUB_TOKEN` environment variables are used,
    if set, in this order, as the default token. Consider using the git
    credential store instead, see `?gh::gh_token`.

## Code of Conduct

Please note that the gh project is released with a [Contributor Code of
Conduct](https://gh.r-lib.org/CODE_OF_CONDUCT.html). By contributing to
this project, you agree to abide by its terms.

## License

MIT © Gábor Csárdi, Jennifer Bryan, Hadley Wickham


================================================
FILE: _pkgdown.yml
================================================
url: https://gh.r-lib.org

template:
  package: tidytemplate
  bootstrap: 5
  includes:
    in_header: |
      <script src="https://cdn.jsdelivr.net/gh/posit-dev/supported-by-posit/js/badge.min.js" data-max-height="43" data-light-bg="#666f76" data-light-fg="#f9f9f9"></script>
      <script defer data-domain="gh.r-lib.org,all.tidyverse.org" src="https://plausible.io/js/plausible.js"></script>


development:
  mode: auto


================================================
FILE: air.toml
================================================


================================================
FILE: codecov.yml
================================================
comment: false

coverage:
  status:
    project:
      default:
        target: auto
        threshold: 1%
        informational: true
    patch:
      default:
        target: auto
        threshold: 1%
        informational: true


================================================
FILE: gh.Rproj
================================================
Version: 1.0

RestoreWorkspace: No
SaveWorkspace: No
AlwaysSaveHistory: Default

EnableCodeIndexing: Yes
UseSpacesForTab: Yes
NumSpacesForTab: 2
Encoding: UTF-8

RnwWeave: knitr
LaTeX: XeLaTeX

AutoAppendNewline: Yes
StripTrailingWhitespace: Yes

BuildType: Package
PackageUseDevtools: Yes
PackageInstallArgs: --no-multiarch --with-keep.source
PackageRoxygenize: rd,collate,namespace


================================================
FILE: inst/WORDLIST
================================================
CMD
Codecov
Github
GraphQL
JSON
LastPass
Minimalistic
PATs
PBC
PSA
ROR
URI
api
auth
discoverable
funder
gitcreds
github
httr
keyring
macOS
pre
programmatically
repo
usethis
wc


================================================
FILE: man/gh-package.Rd
================================================
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/gh-package.R
\docType{package}
\name{gh-package}
\alias{gh-package}
\title{gh: 'GitHub' 'API'}
\description{
Minimal client to access the 'GitHub' 'API'.
}
\seealso{
Useful links:
\itemize{
  \item \url{https://gh.r-lib.org/}
  \item \url{https://github.com/r-lib/gh#readme}
  \item Report bugs at \url{https://github.com/r-lib/gh/issues}
}

}
\author{
\strong{Maintainer}: Gábor Csárdi \email{csardi.gabor@gmail.com} [contributor]

Authors:
\itemize{
  \item Jennifer Bryan
  \item Hadley Wickham
}

Other contributors:
\itemize{
  \item Posit Software, PBC (03wc8by49) [copyright holder, funder]
}

}
\keyword{internal}


================================================
FILE: man/gh.Rd
================================================
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/gh.R
\name{gh}
\alias{gh}
\title{Query the GitHub API}
\usage{
gh(
  endpoint,
  ...,
  per_page = NULL,
  .per_page = NULL,
  .token = NULL,
  .destfile = NULL,
  .overwrite = FALSE,
  .api_url = NULL,
  .method = "GET",
  .limit = NULL,
  .accept = "application/vnd.github.v3+json",
  .send_headers = NULL,
  .progress = TRUE,
  .params = list(),
  .max_wait = 600,
  .max_rate = NULL
)
}
\arguments{
\item{endpoint}{GitHub API endpoint. Must be one of the following forms:
\itemize{
\item \verb{METHOD path}, e.g. \code{GET /rate_limit},
\item \code{path}, e.g. \verb{/rate_limit},
\item \verb{METHOD url}, e.g. \verb{GET https://api.github.com/rate_limit},
\item \code{url}, e.g. \verb{https://api.github.com/rate_limit}.
}

If the method is not supplied, will use \code{.method}, which defaults
to \code{"GET"}.}

\item{...}{Name-value pairs giving API parameters. Will be matched into
\code{endpoint} placeholders, sent as query parameters in GET requests, and as a
JSON body of POST requests. If there is only one unnamed parameter, and it
is a raw vector, then it will not be JSON encoded, but sent as raw data, as
is. This can be used for example to add assets to releases. Named \code{NULL}
values are silently dropped. For GET requests, named \code{NA} values trigger an
error. For other methods, named \code{NA} values are included in the body of the
request, as JSON \code{null}.}

\item{per_page, .per_page}{Number of items to return per page. If omitted,
will be substituted by \code{max(.limit, 100)} if \code{.limit} is set,
otherwise determined by the API (never greater than 100).}

\item{.token}{Authentication token. Defaults to \code{\link[=gh_token]{gh_token()}}.}

\item{.destfile}{Path to write response to disk. If \code{NULL} (default),
response will be processed and returned as an object. If path is given,
response will be written to disk in the form sent. gh writes the
response to a temporary file, and renames that file to \code{.destfile}
after the request was successful. The name of the temporary file is
created by adding a \verb{-<random>.gh-tmp} suffix to it, where \verb{<random>}
is an ASCII string with random characters. gh removes the temporary
file on error.}

\item{.overwrite}{If \code{.destfile} is provided, whether to overwrite an
existing file.  Defaults to \code{FALSE}. If an error happens the original
file is kept.}

\item{.api_url}{Github API url (default: \url{https://api.github.com}). Used
if \code{endpoint} just contains a path. Defaults to \code{GITHUB_API_URL}
environment variable if set.}

\item{.method}{HTTP method to use if not explicitly supplied in the
\code{endpoint}.}

\item{.limit}{Number of records to return. This can be used
instead of manual pagination. By default it is \code{NULL},
which means that the defaults of the GitHub API are used.
You can set it to a number to request more (or less)
records, and also to \code{Inf} to request all records.
Note, that if you request many records, then multiple GitHub
API calls are used to get them, and this can take a potentially
long time.}

\item{.accept}{The value of the \code{Accept} HTTP header. Defaults to
\code{"application/vnd.github.v3+json"} . If \code{Accept} is given in
\code{.send_headers}, then that will be used. This parameter can be used to
provide a custom media type, in order to access a preview feature of
the API.}

\item{.send_headers}{Named character vector of header field values
(except \code{Authorization}, which is handled via \code{.token}). This can be
used to override or augment the default \code{User-Agent} header:
\code{"https://github.com/r-lib/gh"}.}

\item{.progress}{Whether to show a progress indicator for calls that
need more than one HTTP request.}

\item{.params}{Additional list of parameters to append to \code{...}.
It is easier to use this than \code{...} if you have your parameters in
a list already.}

\item{.max_wait}{Maximum number of seconds to wait if rate limited.
Defaults to 10 minutes.}

\item{.max_rate}{Maximum request rate in requests per second. Set
this to automatically throttle requests.}
}
\value{
Answer from the API as a \code{gh_response} object, which is also a
\code{list}. Failed requests will generate an R error. Requests that
generate a raw response will return a raw vector.
}
\description{
This is an extremely minimal client. You need to know the API
to be able to use this client. All this function does is:
\itemize{
\item Try to substitute each listed parameter into \code{endpoint}, using the
\code{{parameter}} notation.
\item If a GET request (the default), then add all other listed parameters
as query parameters.
\item If not a GET request, then send the other parameters in the request
body, as JSON.
\item Convert the response to an R list using \code{\link[jsonlite:fromJSON]{jsonlite::fromJSON()}}.
}
}
\examples{
\dontshow{if (identical(Sys.getenv("IN_PKGDOWN"), "true")) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
## Repositories of a user, these are equivalent
gh("/users/hadley/repos", .limit = 2)
gh("/users/{username}/repos", username = "hadley", .limit = 2)

## Starred repositories of a user
gh("/users/hadley/starred", .limit = 2)
gh("/users/{username}/starred", username = "hadley", .limit = 2)
\dontshow{\}) # examplesIf}
\dontshow{if (FALSE) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
## Create a repository, needs a token (see gh_token())
gh("POST /user/repos", name = "foobar")
\dontshow{\}) # examplesIf}
\dontshow{if (identical(Sys.getenv("IN_PKGDOWN"), "true")) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
## Issues of a repository
gh("/repos/hadley/dplyr/issues")
gh("/repos/{owner}/{repo}/issues", owner = "hadley", repo = "dplyr")

## Automatic pagination
users <- gh("/users", .limit = 50)
length(users)
\dontshow{\}) # examplesIf}
\dontshow{if (FALSE) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
## Access developer preview of Licenses API (in preview as of 2015-09-24)
gh("/licenses") # used to error code 415
gh("/licenses", .accept = "application/vnd.github.drax-preview+json")
\dontshow{\}) # examplesIf}
\dontshow{if (FALSE) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
## Access Github Enterprise API
## Use GITHUB_API_URL environment variable to change the default.
gh("/user/repos", type = "public", .api_url = "https://github.foobar.edu/api/v3")
\dontshow{\}) # examplesIf}
\dontshow{if (FALSE) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
## Use I() to force body part to be sent as an array, even if length 1
## This works whether assignees has length 1 or > 1
assignees <- "gh_user"
assignees <- c("gh_user1", "gh_user2")
gh("PATCH /repos/OWNER/REPO/issues/1", assignees = I(assignees))
\dontshow{\}) # examplesIf}
\dontshow{if (FALSE) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
## There are two ways to send JSON data. One is that you supply one or
## more objects that will be converted to JSON automatically via
## jsonlite::toJSON(). In this case sometimes you need to use
## jsonlite::unbox(), because toJSON() encodes length-1 vectors as JSON
## arrays by default. The Content-Type header is automatically added in this
## case. For example this request turns on GitHub Pages, using this
## API: https://docs.github.com/v3/repos/pages/#enable-a-pages-site

gh::gh(
  "POST /repos/{owner}/{repo}/pages",
  owner = "r-lib",
  repo = "gh",
  source = list(
    branch = jsonlite::unbox("gh-pages"),
    path = jsonlite::unbox("/")
  ),
  .send_headers = c(Accept = "application/vnd.github.switcheroo-preview+json")
)

## The second way is to handle the JSON encoding manually, and supply it
## as a raw vector in an unnamed argument, and also a Content-Type header:

body <- '{ "source": { "branch": "gh-pages", "path": "/" } }'
gh::gh(
  "POST /repos/{owner}/{repo}/pages",
  owner = "r-lib",
  repo = "gh",
  charToRaw(body),
  .send_headers = c(
    Accept = "application/vnd.github.switcheroo-preview+json",
    "Content-Type" = "application/json"
  )
)
\dontshow{\}) # examplesIf}
\dontshow{if (FALSE) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
## Pass along a query to the search/code endpoint via the ... argument
x <- gh::gh(
  "/search/code",
  q = "installation repo:r-lib/gh",
  .send_headers = c("X-GitHub-Api-Version" = "2022-11-28")
)
str(x, list.len = 3, give.attr = FALSE)

\dontshow{\}) # examplesIf}
}
\seealso{
\code{\link[=gh_gql]{gh_gql()}} if you want to use the GitHub GraphQL API,
\code{\link[=gh_whoami]{gh_whoami()}} for details on GitHub API token management.
}


================================================
FILE: man/gh_gql.Rd
================================================
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/gh_gql.R
\name{gh_gql}
\alias{gh_gql}
\title{A simple interface for the GitHub GraphQL API v4.}
\usage{
gh_gql(query, ...)
}
\arguments{
\item{query}{The GraphQL query, as a string.}

\item{...}{Name-value pairs giving API parameters. Will be matched into
\code{endpoint} placeholders, sent as query parameters in GET requests, and as a
JSON body of POST requests. If there is only one unnamed parameter, and it
is a raw vector, then it will not be JSON encoded, but sent as raw data, as
is. This can be used for example to add assets to releases. Named \code{NULL}
values are silently dropped. For GET requests, named \code{NA} values trigger an
error. For other methods, named \code{NA} values are included in the body of the
request, as JSON \code{null}.}
}
\description{
See more about the GraphQL API here:
\url{https://docs.github.com/graphql}
}
\details{
Note: pagination and the \code{.limit} argument do not currently work,
as pagination in the GraphQL API is different from that in the v3 API.
If you need pagination with GraphQL, you'll need to do it manually.
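A minimal sketch of manual pagination, using the standard GraphQL
\code{pageInfo} connection fields. The query below is illustrative, and it
assumes the response parses into a list with a \code{data} element:

\if{html}{\out{<div class="sourceCode r">}}\preformatted{cursor <- NULL
repeat {
  after <- if (is.null(cursor)) "" else sprintf(', after: "\%s"', cursor)
  qry <- sprintf(
    'query { viewer { repositories(first: 100\%s) {
      nodes { name }
      pageInfo { hasNextPage endCursor }
    } } }',
    after
  )
  resp <- gh_gql(qry)
  info <- resp$data$viewer$repositories$pageInfo
  if (!info$hasNextPage) break
  cursor <- info$endCursor
}
}\if{html}{\out{</div>}}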
}
\examples{
\dontshow{if (FALSE) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
gh_gql("query { viewer { login }}")

# Get rate limit
ratelimit_query <- "query {
  viewer {
    login
  }
  rateLimit {
    limit
    cost
    remaining
    resetAt
  }
}"

gh_gql(ratelimit_query)
\dontshow{\}) # examplesIf}
}
\seealso{
\code{\link[=gh]{gh()}} for the GitHub v3 API.
}


================================================
FILE: man/gh_next.Rd
================================================
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/pagination.R
\name{gh_next}
\alias{gh_next}
\alias{gh_prev}
\alias{gh_first}
\alias{gh_last}
\title{Get the next, previous, first or last page of results}
\usage{
gh_next(gh_response, .token = NULL, .send_headers = NULL)

gh_prev(gh_response, .token = NULL, .send_headers = NULL)

gh_first(gh_response, .token = NULL, .send_headers = NULL)

gh_last(gh_response, .token = NULL, .send_headers = NULL)
}
\arguments{
\item{gh_response}{An object returned by a \code{\link[=gh]{gh()}} call.}

\item{.token}{Authentication token. Defaults to \code{\link[=gh_token]{gh_token()}}.}

\item{.send_headers}{Named character vector of header field values
(except \code{Authorization}, which is handled via \code{.token}). This can be
used to override or augment the default \code{User-Agent} header:
\code{"https://github.com/r-lib/gh"}.}
}
\value{
Answer from the API.
}
\description{
Get the next, previous, first or last page of results
}
\details{
Note that these are not always defined. E.g. if the first
page was queried (the default), then the first and previous
pages are not defined. Similarly, if there are no further
results, the next page is not defined, etc.

If the requested page does not exist, an error is thrown.
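For example, all available pages can be collected with a loop. This is a
sketch; the \code{tryCatch()} stops the loop when no next page is defined:

\if{html}{\out{<div class="sourceCode r">}}\preformatted{x <- gh("/users")
pages <- list(x)
repeat {
  x <- tryCatch(gh_next(x), error = function(e) NULL)
  if (is.null(x)) break
  pages <- c(pages, list(x))
}
}\if{html}{\out{</div>}}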
}
\examples{
\dontshow{if (identical(Sys.getenv("IN_PKGDOWN"), "true")) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
x <- gh("/users")
vapply(x, "[[", character(1), "login")
x2 <- gh_next(x)
vapply(x2, "[[", character(1), "login")
\dontshow{\}) # examplesIf}
}
\seealso{
The \code{.limit} argument to \code{\link[=gh]{gh()}} supports fetching more than
one page.
}


================================================
FILE: man/gh_rate_limit.Rd
================================================
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/gh_rate_limit.R
\name{gh_rate_limit}
\alias{gh_rate_limit}
\alias{gh_rate_limits}
\title{Return GitHub user's current rate limits}
\usage{
gh_rate_limit(
  response = NULL,
  .token = NULL,
  .api_url = NULL,
  .send_headers = NULL
)

gh_rate_limits(.token = NULL, .api_url = NULL, .send_headers = NULL)
}
\arguments{
\item{response}{A \code{gh_response} object from a previous \code{gh} call. Rate
limit values are determined from the response headers. This argument is
optional; if it is missing, a call to "GET /rate_limit" is made.}

\item{.token}{Authentication token. Defaults to \code{\link[=gh_token]{gh_token()}}.}

\item{.api_url}{GitHub API URL (default: \url{https://api.github.com}). Used
if \code{endpoint} just contains a path. Defaults to the \code{GITHUB_API_URL}
environment variable, if set.}

\item{.send_headers}{Named character vector of header field values
(except \code{Authorization}, which is handled via \code{.token}). This can be
used to override or augment the default \code{User-Agent} header:
\code{"https://github.com/r-lib/gh"}.}
}
\value{
A \code{list} object containing the overall \code{limit}, \code{remaining} limit, and the
limit \code{reset} time.
}
\description{
\code{gh_rate_limits()} reports on all rate limits for the authenticated user.
\code{gh_rate_limit()} reports on rate limits for a previous successful request.

Further details on GitHub's API rate limit policies are available at
\url{https://docs.github.com/v3/#rate-limiting}.
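For example, a caller might wait until the reset time when the remaining
quota is exhausted (a sketch based on the documented return values):

\if{html}{\out{<div class="sourceCode r">}}\preformatted{lim <- gh_rate_limit()
if (lim$remaining == 0) {
  Sys.sleep(max(0, as.numeric(lim$reset) - as.numeric(Sys.time())))
}
}\if{html}{\out{</div>}}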
}


================================================
FILE: man/gh_token.Rd
================================================
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/gh_token.R
\name{gh_token}
\alias{gh_token}
\alias{gh_token_exists}
\title{Return the local user's GitHub Personal Access Token (PAT)}
\usage{
gh_token(api_url = NULL)

gh_token_exists(api_url = NULL)
}
\arguments{
\item{api_url}{GitHub API URL. Defaults to the \code{GITHUB_API_URL} environment
variable, if set, and otherwise to \url{https://api.github.com}.}
}
\value{
A string, if a PAT is found, or the empty string otherwise. For
convenience, the return value has an S3 class to ensure that simple
printing strategies don't reveal the entire PAT.
}
\description{
If gh can find a personal access token (PAT) via \code{gh_token()}, it includes
the PAT in its requests. Some requests succeed without a PAT, but many
require a PAT to prove the request is authorized by a specific GitHub user. A
PAT also helps with rate limiting. If your gh use is more than casual, you
want a PAT.

gh calls \code{\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_get()}} with the \code{api_url}, which checks session
environment variables (\code{GITHUB_PAT}, \code{GITHUB_TOKEN})
and then the local Git credential store for a PAT
appropriate to the \code{api_url}. Therefore, if you have previously used a PAT
with, e.g., command line Git, gh may retrieve and re-use it. You can call
\code{\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_get()}} directly, yourself, if you want to see what is
found for a specific URL. If no matching PAT is found,
\code{\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_get()}} errors, whereas \code{gh_token()} does not and,
instead, returns \code{""}.

See GitHub's documentation on \href{https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token}{Creating a personal access token},
or use \code{usethis::create_github_token()} for a guided experience, including
pre-selection of recommended scopes. Once you have a PAT, you can use
\code{\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_set()}} to add it to the Git credential store. From that
point on, gh (via \code{\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_get()}}) should be able to find it
without further effort on your part.
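For example, code that only works when authenticated can check for a PAT
first (a sketch using \code{gh_token_exists()}):

\if{html}{\out{<div class="sourceCode r">}}\preformatted{if (gh_token_exists()) {
  gh::gh("/user")
}
}\if{html}{\out{</div>}}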
}
\examples{
\dontrun{
gh_token()

format(gh_token())

str(gh_token())
}
}


================================================
FILE: man/gh_tree_remote.Rd
================================================
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/git.R
\name{gh_tree_remote}
\alias{gh_tree_remote}
\title{Find the GitHub remote associated with a path}
\usage{
gh_tree_remote(path = ".")
}
\arguments{
\item{path}{Path that is contained within a git repo.}
}
\value{
If the repo has a github remote, a list containing \code{username}
and \code{repo}. Otherwise, an error.
}
\description{
This is a handy helper if you want to make gh requests related to the
current project.
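For example (a sketch; the returned list has \code{username} and \code{repo}
components, per the Value section):

\if{html}{\out{<div class="sourceCode r">}}\preformatted{remote <- gh_tree_remote()
gh::gh("/repos/{owner}/{repo}", owner = remote$username, repo = remote$repo)
}\if{html}{\out{</div>}}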
}
\examples{
\dontshow{if (interactive()) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
gh_tree_remote()
\dontshow{\}) # examplesIf}
}


================================================
FILE: man/gh_whoami.Rd
================================================
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/gh_whoami.R
\name{gh_whoami}
\alias{gh_whoami}
\title{Info on current GitHub user and token}
\usage{
gh_whoami(.token = NULL, .api_url = NULL, .send_headers = NULL)
}
\arguments{
\item{.token}{Authentication token. Defaults to \code{\link[=gh_token]{gh_token()}}.}

\item{.api_url}{GitHub API URL (default: \url{https://api.github.com}). Used
if \code{endpoint} just contains a path. Defaults to the \code{GITHUB_API_URL}
environment variable, if set.}

\item{.send_headers}{Named character vector of header field values
(except \code{Authorization}, which is handled via \code{.token}). This can be
used to override or augment the default \code{User-Agent} header:
\code{"https://github.com/r-lib/gh"}.}
}
\value{
A \code{gh_response} object, which is also a \code{list}.
}
\description{
Reports the name, GitHub login, and GitHub URL for the current
authenticated user, the first bit of the token, and the associated scopes.
}
\details{
Get a personal access token for the GitHub API from
\url{https://github.com/settings/tokens} and select the scopes necessary for your
planned tasks. The \code{repo} scope, for example, is one many are likely to need.

On macOS and Windows it is best to store the token in the git credential
store, where most GitHub clients, including gh, can access it. You can
use the gitcreds package to add your token to the credential store:

\if{html}{\out{<div class="sourceCode r">}}\preformatted{gitcreds::gitcreds_set()
}\if{html}{\out{</div>}}

See \url{https://gh.r-lib.org/articles/managing-personal-access-tokens.html}
and \url{https://usethis.r-lib.org/articles/articles/git-credentials.html}
for more about managing GitHub (and generic git) credentials.

On other systems, including Linux, the git credential store is
typically not as convenient, and you might want to store your token in
the \code{GITHUB_PAT} environment variable, which you can set in your
\code{.Renviron} file.
}
\examples{
\dontshow{if (identical(Sys.getenv("IN_PKGDOWN"), "true")) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
gh_whoami()
\dontshow{\}) # examplesIf}
\dontshow{if (FALSE) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}
## explicit token + use with GitHub Enterprise
gh_whoami(
  .token = "8c70fd8419398999c9ac5bacf3192882193cadf2",
  .api_url = "https://github.foobar.edu/api/v3"
)
\dontshow{\}) # examplesIf}
}


================================================
FILE: man/print.gh_response.Rd
================================================
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/print.R
\name{print.gh_response}
\alias{print.gh_response}
\title{Print the result of a GitHub API call}
\usage{
\method{print}{gh_response}(x, ...)
}
\arguments{
\item{x}{The result object.}

\item{...}{Ignored.}
}
\value{
The JSON result.
}
\description{
Print the result of a GitHub API call
}


================================================
FILE: tests/testthat/_snaps/gh.md
================================================
# generates a useful message

    Code
      gh("/missing")
    Condition
      Error in `gh()`:
      ! GitHub API error (404): Not Found
      x URL not found: <https://api.github.com/missing>
      i Read more at <https://docs.github.com/rest>

# can use per_page or .per_page but not both

    Code
      gh("/orgs/tidyverse/repos", per_page = 1, .per_page = 2)
    Condition
      Error in `gh()`:
      ! Exactly one of `per_page` or `.per_page` must be supplied.



================================================
FILE: tests/testthat/_snaps/gh_rate_limit.md
================================================
# errors

    Code
      gh_rate_limit(list())
    Condition
      Error in `gh_rate_limit()`:
      ! inherits(response, "gh_response") is not TRUE
    Code
      gh_rate_limits(.token = "bad")
    Condition
      Error in `gh()`:
      ! GitHub API error (401): Bad credentials
      i Read more at <https://docs.github.com/rest>



================================================
FILE: tests/testthat/_snaps/gh_request.md
================================================
# gh_set_endpoint() refuses to substitute an NA

    Code
      gh_set_endpoint(input)
    Condition
      Error in `gh_set_endpoint()`:
      ! Named NA parameters are not allowed: org

# gh_make_request() errors if unknown verb

    Unknown HTTP verb: "GEEET"



================================================
FILE: tests/testthat/_snaps/gh_response.md
================================================
# warns if output is HTML

    Code
      res <- gh("POST /markdown", text = "foo")
    Condition
      Warning:
      Response came back as html :(



================================================
FILE: tests/testthat/_snaps/gh_token.md
================================================
# get_baseurl() insists on http(s)

    Code
      get_baseurl("github.com")
    Condition
      Error in `get_baseurl()`:
      ! Only works with HTTP(S) protocols
    Code
      get_baseurl("github.acme.com")
    Condition
      Error in `get_baseurl()`:
      ! Only works with HTTP(S) protocols



================================================
FILE: tests/testthat/_snaps/gh_whoami.md
================================================
# whoami errors with bad/absent PAT

    Code
      gh_whoami(.token = "")
    Message
      No personal access token (PAT) available.
      Obtain a PAT from here:
      https://github.com/settings/tokens
      For more on what to do with the PAT, see ?gh_whoami.
    Code
      gh_whoami(.token = NA)
    Condition
      Error in `gh()`:
      ! GitHub API error (401): Requires authentication
      i Read more at <https://docs.github.com/rest/users/users#get-the-authenticated-user>
    Code
      gh_whoami(.token = "blah")
    Condition
      Error in `gh()`:
      ! GitHub API error (401): Bad credentials
      i Read more at <https://docs.github.com/rest>



================================================
FILE: tests/testthat/_snaps/pagination.md
================================================
# can extract relative pages

    Code
      gh_prev(page1)
    Condition
      Error in `gh_link_request()`:
      ! No prev page



================================================
FILE: tests/testthat/_snaps/print.md
================================================
# can print all types of object

    Code
      json
    Output
      {
        "name": "LICENSE",
        "path": "LICENSE",
        "sha": "c71242092c79fcc895841ca3e7de5bbcc551cde5",
        "size": 81,
        "url": "https://api.github.com/repos/r-lib/gh/contents/LICENSE?ref=v1.2.0",
        "html_url": "https://github.com/r-lib/gh/blob/v1.2.0/LICENSE",
        "git_url": "https://api.github.com/repos/r-lib/gh/git/blobs/c71242092c79fcc895841ca3e7de5bbcc551cde5",
        "download_url": "https://raw.githubusercontent.com/r-lib/gh/v1.2.0/LICENSE",
        "type": "file",
        "content": "WUVBUjogMjAxNS0yMDIwCkNPUFlSSUdIVCBIT0xERVI6IEfDoWJvciBDc8Oh\ncmRpLCBKZW5uaWZlciBCcnlhbiwgSGFkbGV5IFdpY2toYW0K\n",
        "encoding": "base64",
        "_links": {
          "self": "https://api.github.com/repos/r-lib/gh/contents/LICENSE?ref=v1.2.0",
          "git": "https://api.github.com/repos/r-lib/gh/git/blobs/c71242092c79fcc895841ca3e7de5bbcc551cde5",
          "html": "https://github.com/r-lib/gh/blob/v1.2.0/LICENSE"
        }
      } 
    Code
      file
    Output
      [1] "LICENSE"
      attr(,"class")
      [1] "gh_response" "path"       
    Code
      raw
    Output
       [1] 59 45 41 52 3a 20 32 30 31 35 2d 32 30 32 30 0a 43 4f 50 59 52 49 47 48 54
      [26] 20 48 4f 4c 44 45 52 3a 20 47 c3 a1 62 6f 72 20 43 73 c3 a1 72 64 69 2c 20
      [51] 4a 65 6e 6e 69 66 65 72 20 42 72 79 61 6e 2c 20 48 61 64 6c 65 79 20 57 69
      [76] 63 6b 68 61 6d 0a
      attr(,"class")
      [1] "gh_response" "raw"        



================================================
FILE: tests/testthat/_snaps/utils.md
================================================
# named NA is error

    Code
      check_named_nas(tc)
    Condition
      Error in `check_named_nas()`:
      ! Named NA parameters are not allowed: `a`

---

    Code
      check_named_nas(tc)
    Condition
      Error in `check_named_nas()`:
      ! Named NA parameters are not allowed: `a`

---

    Code
      check_named_nas(tc)
    Condition
      Error in `check_named_nas()`:
      ! Named NA parameters are not allowed: `c`



================================================
FILE: tests/testthat/helper-offline.R
================================================
skip_if_no_github <- function(has_scope = NULL) {
  skip_if_offline("github.com")
  skip_on_cran()

  if (gh_token() == "") {
    skip("No GitHub token")
  }

  if (!is.null(has_scope) && !has_scope %in% test_scopes()) {
    skip(cli::format_inline("Current token lacks '{has_scope}' scope"))
  }
}

test_scopes <- function() {
  # whoami fails on GHA
  whoami <- env_cache(
    cache,
    "whoami",
    tryCatch(
      gh_whoami(),
      error = function(err) list(scopes = "")
    )
  )
  strsplit(whoami$scopes, ", ")[[1]]
}

cache <- new_environment()


================================================
FILE: tests/testthat/helper.R
================================================
test_package_root <- function() {
  x <- tryCatch(
    rprojroot::find_package_root_file(),
    error = function(e) NULL
  )

  if (!is.null(x)) {
    return(x)
  }

  pkg <- testthat::testing_package()
  x <- tryCatch(
    rprojroot::find_package_root_file(
      path = file.path("..", "..", "00_pkg_src", pkg)
    ),
    error = function(e) NULL
  )

  if (!is.null(x)) {
    return(x)
  }

  stop("Cannot find package root")
}


================================================
FILE: tests/testthat/setup.R
================================================
withr::local_options(
  gh_cache = FALSE,
  .local_envir = testthat::teardown_env()
)


================================================
FILE: tests/testthat/test-gh.R
================================================
test_that("generates a useful message", {
  skip_if_no_github()

  expect_snapshot(gh("/missing"), error = TRUE)
})

test_that("errors return a github_error object", {
  skip_if_no_github()

  e <- tryCatch(gh("/missing"), error = identity)

  expect_s3_class(e, "github_error")
  expect_s3_class(e, "http_error_404")
})

test_that("can catch a given status directly", {
  skip_if_no_github()

  e <- tryCatch(gh("/missing"), "http_error_404" = identity)

  expect_s3_class(e, "github_error")
  expect_s3_class(e, "http_error_404")
})

test_that("can ignore trailing commas", {
  skip_on_cran()
  expect_no_error(gh("/orgs/tidyverse/repos", ))
})

test_that("can use per_page or .per_page but not both", {
  skip_on_cran()
  resp <- gh("/orgs/tidyverse/repos", per_page = 2)
  expect_equal(attr(resp, "request")$query$per_page, 2)

  resp <- gh("/orgs/tidyverse/repos", .per_page = 2)
  expect_equal(attr(resp, "request")$query$per_page, 2)

  expect_snapshot(
    error = TRUE,
    gh("/orgs/tidyverse/repos", per_page = 1, .per_page = 2)
  )
})

test_that("can paginate", {
  skip_on_cran()
  pages <- gh(
    "/orgs/tidyverse/repos",
    per_page = 1,
    .limit = 5,
    .progress = FALSE
  )
  expect_length(pages, 5)
})

test_that("trim output when .limit isn't a multiple of .per_page", {
  skip_on_cran()
  pages <- gh(
    "/orgs/tidyverse/repos",
    per_page = 2,
    .limit = 3,
    .progress = FALSE
  )
  expect_length(pages, 3)
})

test_that("can paginate repository search", {
  skip_on_cran()
  # we need to run this sparingly, otherwise we'll get rate
  # limited and the test fails
  skip_on_ci()
  pages <- gh(
    "/search/repositories",
    q = "tidyverse",
    per_page = 10,
    .limit = 35
  )
  expect_named(pages, c("total_count", "incomplete_results", "items"))
  # Elements aren't trimmed to .limit in this case
  expect_length(pages$items, 40)
})


================================================
FILE: tests/testthat/test-gh_rate_limit.R
================================================
test_that("good input", {
  mock_res <- structure(
    list(),
    class = "gh_response",
    response = list(
      "x-ratelimit-limit" = "5000",
      "x-ratelimit-remaining" = "4999",
      "x-ratelimit-reset" = "1580507619"
    )
  )

  limit <- gh_rate_limit(mock_res)

  expect_equal(limit$limit, 5000L)
  expect_equal(limit$remaining, 4999L)
  expect_s3_class(limit$reset, "POSIXct") # Avoiding tz issues
})

test_that("errors", {
  expect_snapshot(error = TRUE, {
    gh_rate_limit(list())
    gh_rate_limits(.token = "bad")
  })
})

test_that("missing rate limit", {
  mock_res <- structure(
    list(),
    class = "gh_response",
    response = list()
  )

  limit <- gh_rate_limit(mock_res)

  expect_equal(limit$limit, NA_integer_)
  expect_equal(limit$remaining, NA_integer_)
  expect_equal(as.double(limit$reset), NA_real_)
})


================================================
FILE: tests/testthat/test-gh_request.R
================================================
test_that("all forms of specifying endpoint are equivalent", {
  r1 <- gh_build_request("GET /rate_limit")
  expect_equal(r1$method, "GET")
  expect_equal(r1$url, "https://api.github.com/rate_limit")

  expect_equal(gh_build_request("/rate_limit"), r1)
  expect_equal(gh_build_request("GET https://api.github.com/rate_limit"), r1)
  expect_equal(gh_build_request("https://api.github.com/rate_limit"), r1)
})

test_that("method arg sets default method", {
  r <- gh_build_request("/rate_limit", method = "POST")
  expect_equal(r$method, "POST")
})

test_that("parameter substitution is equivalent to direct specification (:)", {
  subst <-
    gh_build_request(
      "POST /repos/:org/:repo/issues/:number/labels",
      params = list(
        org = "ORG",
        repo = "REPO",
        number = "1",
        "body"
      )
    )
  spec <-
    gh_build_request(
      "POST /repos/ORG/REPO/issues/1/labels",
      params = list("body")
    )
  expect_identical(subst, spec)
})

test_that("parameter substitution is equivalent to direct specification", {
  subst <-
    gh_build_request(
      "POST /repos/{org}/{repo}/issues/{number}/labels",
      params = list(
        org = "ORG",
        repo = "REPO",
        number = "1",
        "body"
      )
    )
  spec <-
    gh_build_request(
      "POST /repos/ORG/REPO/issues/1/labels",
      params = list("body")
    )
  expect_identical(subst, spec)
})

test_that("URI templates that need expansion are detected", {
  expect_true(is_uri_template("/orgs/{org}/repos"))
  expect_true(is_uri_template("/repos/{owner}/{repo}"))
  expect_false(is_uri_template("/user/repos"))
})

test_that("older 'colon templates' are detected", {
  expect_true(is_colon_template("/orgs/:org/repos"))
  expect_true(is_colon_template("/repos/:owner/:repo"))
  expect_false(is_colon_template("/user/repos"))
})

test_that("gh_set_endpoint() works", {
  # no expansion, no extra params
  input <- list(endpoint = "/user/repos")
  expect_equal(input, gh_set_endpoint(input))

  # no expansion, with extra params
  input <- list(endpoint = "/user/repos", params = list(page = 2))
  expect_equal(input, gh_set_endpoint(input))

  # expansion, no extra params
  input <- list(
    endpoint = "/repos/{owner}/{repo}",
    params = list(owner = "OWNER", repo = "REPO")
  )
  out <- gh_set_endpoint(input)
  expect_equal(
    out,
    list(endpoint = "/repos/OWNER/REPO", params = list())
  )

  # expansion, with extra params
  input <- list(
    endpoint = "/repos/{owner}/{repo}/issues",
    params = list(state = "open", owner = "OWNER", repo = "REPO", page = 2)
  )
  out <- gh_set_endpoint(input)
  expect_equal(out$endpoint, "/repos/OWNER/REPO/issues")
  expect_equal(out$params, list(state = "open", page = 2))
})

test_that("gh_set_endpoint() refuses to substitute an NA", {
  input <- list(
    endpoint = "POST /orgs/{org}/repos",
    params = list(org = NA)
  )
  expect_snapshot(error = TRUE, gh_set_endpoint(input))
})

test_that("gh_set_endpoint() allows a named NA in body for non-GET", {
  input <- list(
    endpoint = "PUT /repos/{owner}/{repo}/pages",
    params = list(owner = "OWNER", repo = "REPO", cname = NA)
  )
  out <- gh_set_endpoint(input)
  expect_equal(out$endpoint, "PUT /repos/OWNER/REPO/pages")
  expect_equal(out$params, list(cname = NA))
})

test_that("gh_set_url() ensures URL is in 'API form'", {
  input <- list(
    endpoint = "/user/repos",
    api_url = "https://github.com"
  )
  out <- gh_set_url(input)
  expect_equal(out$api_url, "https://api.github.com")

  input$api_url <- "https://github.acme.com"
  out <- gh_set_url(input)
  expect_equal(out$api_url, "https://github.acme.com/api/v3")
})

test_that("gh_make_request() errors if unknown verb", {
  expect_snapshot_error(gh("geeet /users/hadley/repos", .limit = 2))
})


================================================
FILE: tests/testthat/test-gh_response.R
================================================
test_that("works with empty bodies", {
  skip_if_no_github()

  out <- gh("GET /orgs/{org}/repos", org = "gh-org-testing-no-repos")
  expect_equal(out, list(), ignore_attr = TRUE)

  out <- gh("POST /markdown", text = "")
  expect_equal(out, list(), ignore_attr = TRUE)
})

test_that("works with empty bodies from DELETE", {
  skip_if_no_github(has_scope = "gist")

  out <- gh(
    "POST /gists",
    files = list(x = list(content = "y")),
    public = FALSE
  )
  out <- gh("DELETE /gists/{gist_id}", gist_id = out$id)
  expect_equal(out, list(), ignore_attr = TRUE)
})

test_that("can get raw response", {
  skip_if_no_github()

  res <- gh(
    "GET /repos/{owner}/{repo}/contents/{path}",
    owner = "r-lib",
    repo = "gh",
    path = "DESCRIPTION",
    .send_headers = c(Accept = "application/vnd.github.v3.raw")
  )

  expect_equal(
    attr(res, "response")[["x-github-media-type"]],
    "github.v3; param=raw"
  )
  expect_equal(class(res), c("gh_response", "raw"))
})

test_that("can download files", {
  skip_if_no_github()

  tmp <- withr::local_tempfile()
  res_file <- gh(
    "/orgs/{org}/repos",
    org = "r-lib",
    type = "sources",
    .destfile = tmp
  )
  expect_equal(class(res_file), c("gh_response", "path"))
  expect_equal(res_file, tmp, ignore_attr = TRUE)
})

test_that("warns if output is HTML", {
  skip_on_cran()
  expect_snapshot(res <- gh("POST /markdown", text = "foo"))

  expect_equal(res, list(message = "<p>foo</p>\n"), ignore_attr = TRUE)
  expect_equal(class(res), c("gh_response", "list"))
})

test_that("captures details to recreate request", {
  skip_on_cran()
  res <- gh("/orgs/{org}/repos", org = "r-lib", .per_page = 1)

  req <- attr(res, "request")
  expect_type(req, "list")
  expect_equal(req$url, "https://api.github.com/orgs/r-lib/repos")
  expect_equal(req$query, list(per_page = 1))
})

test_that("output file is not overwritten on error", {
  tmp <- withr::local_tempfile()
  writeLines("foo", tmp)

  err <- tryCatch(
    gh("/repos", .destfile = tmp),
    error = function(e) e
  )

  expect_true(file.exists(tmp))
  expect_equal(readLines(tmp), "foo")
  expect_true(!is.null(err$response_content))
})


test_that("gh_response objects can be combined via vctrs #161", {
  skip_on_cran()
  skip_if_not_installed("vctrs")
  user_1 <- gh("/users", .limit = 1)
  user_2 <- gh("/users", .limit = 1, )
  user_vec <- vctrs::vec_c(user_1, user_2)
  user_df <- vctrs::vec_rbind(user_1[[1]], user_2[[1]])
  expect_equal(length(user_vec), 2)
  expect_equal(nrow(user_df), 2)
})


================================================
FILE: tests/testthat/test-gh_token.R
================================================
test_that("URL specific token is used", {
  good <- gh_pat(strrep("a", 40))
  good2 <- gh_pat(strrep("b", 40))
  bad <- gh_pat(strrep("0", 40))
  bad2 <- gh_pat(strrep("1", 40))

  env <- c(
    GITHUB_API_URL = "https://github.acme.com",
    GITHUB_PAT_GITHUB_ACME_COM = good,
    GITHUB_PAT_GITHUB_ACME2_COM = good2,
    GITHUB_PAT = bad,
    GITHUB_TOKEN = bad2
  )
  withr::with_envvar(env, {
    expect_equal(gh_token(), good)
    expect_equal(gh_token("https://github.acme2.com"), good2)
  })

  env <- c(
    GITHUB_API_URL = NA,
    GITHUB_PAT_GITHUB_COM = good,
    GITHUB_PAT = bad,
    GITHUB_TOKEN = bad2
  )
  withr::with_envvar(env, {
    expect_equal(gh_token(), good)
    expect_equal(gh_token("https://api.github.com"), good)
  })
})

test_that("fall back to GITHUB_PAT, then GITHUB_TOKEN", {
  pat <- gh_pat(strrep("a", 40))
  token <- gh_pat(strrep("0", 40))

  env <- c(
    GITHUB_API_URL = NA,
    GITHUB_PAT_GITHUB_COM = NA,
    GITHUB_PAT = pat,
    GITHUB_TOKEN = token
  )
  withr::with_envvar(env, {
    expect_equal(gh_token(), pat)
    expect_equal(gh_token("https://api.github.com"), pat)
  })

  env <- c(
    GITHUB_API_URL = NA,
    GITHUB_PAT_GITHUB_COM = NA,
    GITHUB_PAT = NA,
    GITHUB_TOKEN = token
  )
  withr::with_envvar(env, {
    expect_equal(gh_token(), token)
    expect_equal(gh_token("https://api.github.com"), token)
  })
})

test_that("gh_token_exists works as expected", {
  withr::local_envvar(GITHUB_API_URL = "https://test.com")

  withr::local_envvar(GITHUB_PAT_TEST_COM = NA)
  expect_false(gh_token_exists())

  withr::local_envvar(GITHUB_PAT_TEST_COM = gh_pat(strrep("0", 40)))
  expect_true(gh_token_exists())

  withr::local_envvar(GITHUB_PAT_TEST_COM = "invalid")
  expect_false(gh_token_exists())
})

# gh_pat class ----
test_that("validate_gh_pat() rejects bad characters, wrong # of characters", {
  # older PATs
  expect_error(gh_pat(strrep("a", 40)), NA)
  expect_error(
    gh_pat(strrep("g", 40)),
    "40 hexadecimal digits",
    class = "error"
  )
  expect_error(gh_pat("aa"), "40 hexadecimal digits", class = "error")

  # newer PATs
  expect_error(gh_pat(paste0("ghp_", strrep("B", 36))), NA)
  expect_error(gh_pat(paste0("ghp_", strrep("3", 251))), NA)
  expect_error(gh_pat(paste0("github_pat_", strrep("A", 36))), NA)
  expect_error(gh_pat(paste0("github_pat_", strrep("3", 244))), NA)
  expect_error(
    gh_pat(paste0("ghJ_", strrep("a", 36))),
    "prefix",
    class = "error"
  )
  expect_error(
    gh_pat(paste0("github_pa_", strrep("B", 244))),
    "github_pat_",
    class = "error"
  )
})

test_that("format.gh_pat() and str.gh_pat() hide the middle stuff", {
  pat <- paste0(strrep("a", 10), strrep("4", 20), strrep("F", 10))
  expect_match(format(gh_pat(pat)), "[a-zA-Z]+")
  expect_output(str(gh_pat(pat)), "[a-zA-Z]+")
})

test_that("str.gh_pat() indicates it's a `gh_pat`", {
  pat <- paste0(strrep("a", 10), strrep("4", 20), strrep("F", 10))
  expect_output(str(gh_pat(pat)), "gh_pat")
})

test_that("format.gh_pat() handles empty string", {
  expect_match(format(gh_pat("")), "<no PAT>")
})

# URL processing helpers ----
test_that("get_baseurl() insists on http(s)", {
  expect_snapshot(error = TRUE, {
    get_baseurl("github.com")
    get_baseurl("github.acme.com")
  })
})

test_that("get_baseurl() works", {
  x <- "https://github.com"
  expect_equal(get_baseurl("https://github.com"), x)
  expect_equal(get_baseurl("https://github.com/"), x)
  expect_equal(get_baseurl("https://github.com/stuff"), x)
  expect_equal(get_baseurl("https://github.com/stuff/"), x)
  expect_equal(get_baseurl("https://github.com/more/stuff"), x)

  x <- "https://api.github.com"
  expect_equal(get_baseurl("https://api.github.com"), x)
  expect_equal(get_baseurl("https://api.github.com/rate_limit"), x)

  x <- "https://github.acme.com"
  expect_equal(get_baseurl("https://github.acme.com"), x)
  expect_equal(get_baseurl("https://github.acme.com/"), x)
  expect_equal(get_baseurl("https://github.acme.com/api/v3"), x)

  # so (what little) support we have for user@host doesn't regress
  expect_equal(
    get_baseurl("https://jane@github.acme.com/api/v3"),
    "https://jane@github.acme.com"
  )
})

test_that("is_github_dot_com() works", {
  expect_true(is_github_dot_com("https://github.com"))
  expect_true(is_github_dot_com("https://api.github.com"))
  expect_true(is_github_dot_com("https://api.github.com/rate_limit"))
  expect_true(is_github_dot_com("https://api.github.com/graphql"))

  expect_false(is_github_dot_com("https://github.acme.com"))
  expect_false(is_github_dot_com("https://github.acme.com/api/v3"))
  expect_false(is_github_dot_com("https://github.acme.com/api/v3/user"))
})

test_that("get_hosturl() works", {
  x <- "https://github.com"
  expect_equal(get_hosturl("https://github.com"), x)
  expect_equal(get_hosturl("https://api.github.com"), x)

  x <- "https://github.acme.com"
  expect_equal(get_hosturl("https://github.acme.com"), x)
  expect_equal(get_hosturl("https://github.acme.com/api/v3"), x)
})

test_that("get_apiurl() works", {
  x <- "https://api.github.com"
  expect_equal(get_apiurl("https://github.com"), x)
  expect_equal(get_apiurl("https://github.com/"), x)
  expect_equal(get_apiurl("https://github.com/r-lib/gh/issues"), x)
  expect_equal(get_apiurl("https://api.github.com"), x)
  expect_equal(get_apiurl("https://api.github.com/rate_limit"), x)

  x <- "https://github.acme.com/api/v3"
  expect_equal(get_apiurl("https://github.acme.com"), x)
  expect_equal(get_apiurl("https://github.acme.com/OWNER/REPO"), x)
  expect_equal(get_apiurl("https://github.acme.com/api/v3"), x)
})

test_that("tokens can be requested from a Connect server", {
  skip_if_not_installed("connectcreds")

  token <- strrep("a", 40)
  connectcreds::local_mocked_connect_responses(token = token)
  expect_equal(gh_token(), gh_pat(token))
})


================================================
FILE: tests/testthat/test-gh_whoami.R
================================================
test_that("whoami works in presence of PAT", {
  skip_if_no_github(has_scope = "user")

  res <- gh_whoami()
  expect_s3_class(res, "gh_response")
  expect_match(res[["scopes"]], "\\buser\\b")
})

test_that("whoami errors with bad/absent PAT", {
  skip_if_no_github()
  skip_on_ci() # since no token sometimes fails due to rate-limiting
  withr::local_envvar(GH_FORCE_HTTP_1_1 = "true")

  expect_snapshot(error = TRUE, {
    gh_whoami(.token = "")
    gh_whoami(.token = NA)
    gh_whoami(.token = "blah")
  })
})


================================================
FILE: tests/testthat/test-git.R
================================================
test_that("picks origin if available", {
  remotes <- list(
    upstream = "https://github.com/x/1",
    origin = "https://github.com/x/2"
  )

  expect_warning(gr <- github_remote(remotes, "."), "Using origin")
  expect_equal(gr$repo, "2")
})

test_that("otherwise picks first", {
  remotes <- list(
    a = "https://github.com/x/1",
    b = "https://github.com/x/2"
  )

  expect_warning(gr <- github_remote(remotes, "."), "Using first")
  expect_equal(gr$repo, "1")
})


# Parsing -----------------------------------------------------------------

test_that("parses common url forms", {
  expected <- list(username = "x", repo = "y")

  expect_equal(github_remote_parse("https://github.com/x/y.git"), expected)
  expect_equal(github_remote_parse("https://github.com/x/y"), expected)
  expect_equal(github_remote_parse("git@github.com:x/y.git"), expected)
})

test_that("returns NULL if can't parse", {
  expect_equal(github_remote_parse("blah"), NULL)
})


================================================
FILE: tests/testthat/test-mock-repos.R
================================================
if (!exists("TMPL", environment(), inherits = FALSE)) {
  TMPL <- function(x) x
}

test_that("repos, some basics", {
  skip_if_no_github()

  res <- gh(
    TMPL("/users/{username}/repos"),
    username = "gaborcsardi"
  )
  expect_true(all(c("id", "name", "full_name") %in% names(res[[1]])))

  res <- gh(
    TMPL("/orgs/{org}/repos"),
    org = "r-lib",
    type = "sources",
    sort = "full_name"
  )
  expect_true("actions" %in% vapply(res, "[[", "name", FUN.VALUE = ""))

  res <- gh("/repositories")
  expect_true(all(c("id", "name", "full_name") %in% names(res[[1]])))
})

test_that("can POST, PATCH, and DELETE", {
  skip_if_no_github(has_scope = "gist")

  res <- gh(
    "POST /gists",
    files = list(test.R = list(content = "test")),
    description = "A test gist for gh",
    public = FALSE
  )
  expect_equal(res$description, "A test gist for gh")
  expect_false(res$public)

  res <- gh(
    TMPL("PATCH /gists/{gist_id}"),
    gist_id = res$id,
    description = "Still a test repo"
  )
  expect_equal(res$description, "Still a test repo")

  res <- gh(
    TMPL("DELETE /gists/{gist_id}"),
    gist_id = res$id
  )
  expect_s3_class(res, c("gh_response", "list"))
})


================================================
FILE: tests/testthat/test-old-templates.R
================================================
TMPL <- function(x) {
  gsub("[{]([^}]+)[}]", ":\\1", x)
}

source("test-mock-repos.R", local = TRUE)


================================================
FILE: tests/testthat/test-pagination.R
================================================
test_that("can extract relative pages", {
  skip_on_cran()
  page1 <- gh("/orgs/tidyverse/repos", per_page = 1)
  expect_true(gh_has(page1, "next"))
  expect_false(gh_has(page1, "prev"))

  page2 <- gh_next(page1)
  expect_equal(
    attr(page2, "request")$url,
    "https://api.github.com/organizations/22032646/repos?per_page=1&page=2"
  )
  expect_true(gh_has(page2, "prev"))

  expect_snapshot(gh_prev(page1), error = TRUE)
})

test_that("can paginate even when space re-encoded to +", {
  skip_on_cran()
  json <- gh::gh(
    "GET /search/issues",
    q = 'label:"tidy-dev-day :nerd_face:"',
    per_page = 10,
    .limit = 20
  )
  expect_length(json$items, 20)
})

test_that("paginated request gets max_wait and max_rate", {
  skip_on_cran()
  gh <- gh("/orgs/tidyverse/repos", per_page = 5, .max_wait = 1, .max_rate = 10)

  req <- gh_link_request(gh, "next", .token = NULL, .send_headers = NULL)
  expect_equal(req$max_wait, 1)
  expect_equal(req$max_rate, 10)

  url <- httr2::url_parse(req$url)
  expect_equal(url$query$page, "2")
})


================================================
FILE: tests/testthat/test-print.R
================================================
test_that("can print all types of object", {
  skip_on_cran()
  local_options(gh_cache = FALSE)

  get_license <- function(...) {
    gh(
      "GET /repos/{owner}/{repo}/contents/{path}",
      owner = "r-lib",
      repo = "gh",
      path = "LICENSE",
      ref = "v1.2.0",
      ...
    )
  }

  json <- get_license()
  raw <- get_license(
    .send_headers = c(Accept = "application/vnd.github.v3.raw")
  )

  path <- withr::local_file(test_path("LICENSE"))
  file <- get_license(
    .destfile = path,
    .send_headers = c(Accept = "application/vnd.github.v3.raw")
  )

  expect_snapshot({
    json
    file
    raw
  })
})


================================================
FILE: tests/testthat/test-spelling.R
================================================
test_that("spelling", {
  skip_on_cran()
  skip_on_covr()
  pkgroot <- test_package_root()
  err <- spelling::spell_check_package(pkgroot)
  num_spelling_errors <- nrow(err)
  expect_true(
    num_spelling_errors == 0,
    info = paste(
      c("\nSpelling errors:", capture.output(err)),
      collapse = "\n"
    )
  )
})


================================================
FILE: tests/testthat/test-utils.R
================================================
test_that("can detect presence vs. absence of names", {
  expect_identical(has_name(list("foo", "bar")), c(FALSE, FALSE))
  expect_identical(has_name(list(a = "foo", "bar")), c(TRUE, FALSE))

  expect_identical(
    has_name({
      x <- list("foo", "bar")
      names(x)[1] <- "a"
      x
    }),
    c(TRUE, FALSE)
  )
  expect_identical(
    has_name({
      x <- list("foo", "bar")
      names(x)[1] <- "a"
      names(x)[2] <- ""
      x
    }),
    c(TRUE, FALSE)
  )

  expect_identical(
    has_name({
      x <- list("foo", "bar")
      names(x)[1] <- ""
      x
    }),
    c(FALSE, FALSE)
  )
  expect_identical(
    has_name({
      x <- list("foo", "bar")
      names(x)[1] <- ""
      names(x)[2] <- ""
      x
    }),
    c(FALSE, FALSE)
  )
})

test_that("named NULL is dropped", {
  tcs <- list(
    list(list(), list()),
    list(list(a = 1), list(a = 1)),
    list(list(NULL), list(NULL)),
    list(list(a = NULL), list()),
    list(list(NULL, a = NULL, 1), list(NULL, 1)),
    list(list(a = NULL, b = 1, 5), list(b = 1, 5))
  )

  for (tc in tcs) {
    expect_identical(
      drop_named_nulls(tc[[1]]),
      tc[[2]],
      info = tc
    )
  }
})

test_that("named NA is error", {
  goodtcs <- list(
    list(),
    list(NA),
    list(NA, NA_integer_, a = 1)
  )

  badtcs <- list(
    list(b = NULL, a = NA),
    list(a = NA_integer_),
    list(NA, c = NA_real_)
  )

  for (tc in goodtcs) {
    expect_silent(check_named_nas(tc))
  }

  for (tc in badtcs) {
    expect_snapshot(error = TRUE, check_named_nas(tc))
  }
})


test_that(".parse_params combines list .params with ... params", {
  params <- list(
    .parse_params(org = "ORG", repo = "REPO", number = "1"),
    .parse_params(org = "ORG", repo = "REPO", .params = list(number = "1")),
    .parse_params(.params = list(org = "ORG", repo = "REPO", number = "1"))
  )

  expect_identical(params[[1]], params[[2]])
  expect_identical(params[[2]], params[[3]])
})


================================================
FILE: tests/testthat.R
================================================
library(testthat)
library(gh)

if (Sys.getenv("NOT_CRAN") == "true") {
  test_check("gh")
}


================================================
FILE: vignettes/.gitignore
================================================
*.html
*.R


================================================
FILE: vignettes/managing-personal-access-tokens.Rmd
================================================
---
title: "Managing Personal Access Tokens"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Managing Personal Access Tokens}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---

```{r}
#| include: false
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)
```

```{r}
#| label: setup
library(gh)
```

<!-- This vignette uses a convention of "one sentence per line" in prose. -->

gh generally sends a Personal Access Token (PAT) with its requests.
Some endpoints of the GitHub API can be accessed without authenticating yourself.
But once your API use becomes more frequent, you will want a PAT to prevent problems with rate limits and to access all possible endpoints.

This article describes how to store your PAT so that gh can find it (automatically, in most cases).
The function gh uses for this lookup is `gh_token()`.
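
You can see what gh discovers by calling `gh_token()` yourself; the returned value prints in a masked form.
A minimal sketch (the Enterprise host below is a placeholder):

```r
# Inspect the PAT gh would use for the default host (prints masked)
gh::gh_token()

# Or for a specific GitHub Enterprise host
gh::gh_token("https://github.acme.com")
```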

More resources on PAT management:

  * GitHub documentation on [Creating a personal access token](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token)
    - Important: a PAT can expire; the default expiration is 30 days.
  * In the [usethis package](https://usethis.r-lib.org):
    - Vignette: [Managing Git(Hub) Credentials](https://usethis.r-lib.org/articles/articles/git-credentials.html) 
    - `usethis::gh_token_help()` and `usethis::git_sitrep()` help you check if
      a PAT is discoverable and has suitable scopes
    - `usethis::create_github_token()` guides you through the process of getting
      a new PAT
  * In the [gitcreds package](https://gitcreds.r-lib.org/):
    - `gitcreds::gitcreds_set()` helps you explicitly put your PAT into the Git
      credential store
  
## PAT and host

`gh::gh()` allows the user to provide a PAT via the `.token` argument and to specify a host other than "github.com" via the `.api_url` argument.
(Some companies and universities run their own instance of GitHub Enterprise.)

```{r}
#| eval: false
gh(endpoint, ..., .token = NULL, ..., .api_url = NULL, ...)
```

However, it's annoying to always provide your PAT or host and it's unsafe for your PAT to appear explicitly in your R code.
It's important to make it *possible* for the user to provide the PAT and/or API URL directly, but it should rarely be necessary.
`gh::gh()` is designed to play well with more secure, less fiddly methods for expressing what you want.

How are `.api_url` and `.token` determined when the user does not provide them?

  1. `.api_url` defaults to the value of the `GITHUB_API_URL` environment
    variable and, if that is unset, falls back to `"https://api.github.com"`.
    This is always done before worrying about the PAT.
  1. The PAT is obtained via a call to `gh_token(.api_url)`. That is, the token
    is looked up based on the host.
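
The two-step resolution above can be sketched roughly as follows.
This is a simplified illustration of the lookup order, not gh's actual internals:

```r
# Simplified sketch of how gh resolves the API URL and PAT when the
# caller supplies neither; gh's real internals differ in detail.
resolve_request_defaults <- function(.api_url = NULL, .token = NULL) {
  # 1. API URL: explicit argument, then env var, then the public default
  if (is.null(.api_url)) {
    .api_url <- Sys.getenv("GITHUB_API_URL", unset = "https://api.github.com")
  }
  # 2. PAT: looked up for that host, *after* the URL is settled
  if (is.null(.token)) {
    .token <- gh::gh_token(.api_url)
  }
  list(api_url = .api_url, token = .token)
}
```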

## The gitcreds package

gh now uses the gitcreds package to interact with the Git credential store.

gh calls `gitcreds::gitcreds_get()` with a URL to try to find a matching PAT.
`gitcreds::gitcreds_get()` checks session environment variables and then the local Git credential store.
Therefore, if you have previously used a PAT with, e.g., command line Git, gh may retrieve and re-use it.
You can call `gitcreds::gitcreds_get()` directly, yourself, if you want to see what is found for a specific URL.

``` r
gitcreds::gitcreds_get()
```

If you see something like this:
``` r
#> <gitcreds>
#>   protocol: https
#>   host    : github.com
#>   username: PersonalAccessToken
#>   password: <-- hidden -->
```
that means that gitcreds could get the PAT from the Git credential store.
You can call `gitcreds_get()$password` to see the actual PAT.

If no matching PAT is found, `gitcreds::gitcreds_get()` errors.

## PAT in an environment variable

If you don't have a Git installation, or your Git installation does not have a working credential store, then you can specify the PAT in an environment variable.
For `github.com` you can set the `GITHUB_PAT_GITHUB_COM` or `GITHUB_PAT` variable.
For a different GitHub host, call `gitcreds::gitcreds_cache_envvar()` with the API URL to see the environment variable you need to set.
For example:

```{r}
gitcreds::gitcreds_cache_envvar("https://github.acme.com")
```
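
Once you know the variable name, you can set it for the current R session.
The host and token value below are placeholders, not real credentials:

```r
# gitcreds_cache_envvar() reported the variable name to use for this
# (hypothetical) host; the token value here is a placeholder.
Sys.setenv(GITHUB_PAT_GITHUB_ACME_COM = "ghp_placeholder_token_value")

# gh will now pick it up for requests against that host:
# gh::gh_token("https://github.acme.com")
```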

## Recommendations

On a machine used for interactive development, we recommend:

  * Store your PAT(s) in an official credential store.
  * Do **not** store your PAT(s) in plain text in, e.g., `.Renviron`. In the
    past, this has been a common and recommended practice for pragmatic reasons.
    However, gitcreds/gh have now evolved to the point where it's
    possible for all of us to follow better security practices.
  * If you use a general-purpose password manager, like 1Password or LastPass,
    you may *also* want to store your PAT(s) there. Why? If your PAT is
    "forgotten" from the OS-level credential store, intentionally or not, you'll
    need to provide it again when prompted.
    
    If you don't have any other record of your PAT, you'll have to get a new
    PAT whenever this happens. This is not the end of the world. But if you
    aren't disciplined about deleting lost PATs from
    <https://github.com/settings/tokens>, you will eventually find yourself in a
    confusing situation where you can't be sure which PAT(s) are in use.

On a headless system, such as on a CI/CD platform, provide the necessary PAT(s) via secure environment variables.
Regular environment variables can be used to configure less sensitive settings, such as the API host.
Don't expose your PAT by doing something silly like dumping all environment variables to a log file.

Note that on GitHub Actions, specifically, a personal access token is [automatically available to the workflow](https://docs.github.com/en/actions/configuring-and-managing-workflows/authenticating-with-the-github_token) as the `GITHUB_TOKEN` secret.
That is why many workflows in the R community contain this snippet:

``` yaml
env:
  GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}
```

This makes the automatic PAT available as the `GITHUB_PAT` environment variable.
If that PAT doesn't have the right permissions, then you'll need to explicitly provide one that does (see link above for more).

## Failure

If there is no PAT to be had, `gh::gh()` sends a request with no token.
(Internally, the `Authorization` header is omitted if the PAT is found to be the empty string, `""`.)

What do PAT-related failures look like?

If no PAT is sent and the endpoint requires no auth, the request probably succeeds!
At least until you run up against rate limits.
If the endpoint requires auth, you'll get an HTTP error, possibly this one:

```
GitHub API error (401): 401 Unauthorized
Message: Requires authentication
```

If a PAT is first discovered in an environment variable, it is taken at face value.
The two most common ways to arrive here are PAT specification via `.Renviron` or as a secret in a CI/CD platform, such as GitHub Actions.
If the PAT is invalid, the first affected request will fail, probably like so:

```
GitHub API error (401): 401 Unauthorized
Message: Bad credentials
```

This will also be the experience if an invalid PAT is provided directly via `.token`.

Even a valid PAT can lead to a downstream error, if it has insufficient scopes with respect to a specific request.
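
If you want to handle such failures programmatically, you can catch the error condition that `gh::gh()` signals.
A hedged sketch, assuming gh's error conditions carry an `http_error_401` class for this case (check the condition classes in your gh version before relying on this):

```r
# Attempt a request that requires auth and handle a 401 gracefully.
res <- tryCatch(
  gh::gh("/user", .token = "blah"),
  http_error_401 = function(e) {
    message("Bad or missing credentials; check your PAT setup.")
    NULL
  }
)
```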
    "preview": "test_that(\"all forms of specifying endpoint are equivalent\", {\n  r1 <- gh_build_request(\"GET /rate_limit\")\n  expect_equa"
  },
  {
    "path": "tests/testthat/test-gh_response.R",
    "chars": 2531,
    "preview": "test_that(\"works with empty bodies\", {\n  skip_if_no_github()\n\n  out <- gh(\"GET /orgs/{org}/repos\", org = \"gh-org-testing"
  },
  {
    "path": "tests/testthat/test-gh_token.R",
    "chars": 5848,
    "preview": "test_that(\"URL specific token is used\", {\n  good <- gh_pat(strrep(\"a\", 40))\n  good2 <- gh_pat(strrep(\"b\", 40))\n  bad <- "
  },
  {
    "path": "tests/testthat/test-gh_whoami.R",
    "chars": 515,
    "preview": "test_that(\"whoami works in presence of PAT\", {\n  skip_if_no_github(has_scope = \"user\")\n\n  res <- gh_whoami()\n  expect_s3"
  },
  {
    "path": "tests/testthat/test-git.R",
    "chars": 958,
    "preview": "test_that(\"picks origin if available\", {\n  remotes <- list(\n    upstream = \"https://github.com/x/1\",\n    origin = \"https"
  },
  {
    "path": "tests/testthat/test-mock-repos.R",
    "chars": 1188,
    "preview": "if (!exists(\"TMPL\", environment(), inherits = FALSE)) {\n  TMPL <- function(x) x\n}\n\ntest_that(\"repos, some basics\", {\n  s"
  },
  {
    "path": "tests/testthat/test-old-templates.R",
    "chars": 102,
    "preview": "TMPL <- function(x) {\n  gsub(\"[{]([^}]+)[}]\", \":\\\\1\", x)\n}\n\nsource(\"test-mock-repos.R\", local = TRUE)\n"
  },
  {
    "path": "tests/testthat/test-pagination.R",
    "chars": 1045,
    "preview": "test_that(\"can extract relative pages\", {\n  skip_on_cran()\n  page1 <- gh(\"/orgs/tidyverse/repos\", per_page = 1)\n  expect"
  },
  {
    "path": "tests/testthat/test-print.R",
    "chars": 631,
    "preview": "test_that(\"can print all types of object\", {\n  skip_on_cran()\n  local_options(gh_cache = FALSE)\n\n  get_license <- functi"
  },
  {
    "path": "tests/testthat/test-spelling.R",
    "chars": 324,
    "preview": "test_that(\"spelling\", {\n  skip_on_cran()\n  skip_on_covr()\n  pkgroot <- test_package_root()\n  err <- spelling::spell_chec"
  },
  {
    "path": "tests/testthat/test-utils.R",
    "chars": 1938,
    "preview": "test_that(\"can detect presence vs absence names\", {\n  expect_identical(has_name(list(\"foo\", \"bar\")), c(FALSE, FALSE))\n  "
  },
  {
    "path": "tests/testthat.R",
    "chars": 92,
    "preview": "library(testthat)\nlibrary(gh)\n\nif (Sys.getenv(\"NOT_CRAN\") == \"true\") {\n  test_check(\"gh\")\n}\n"
  },
  {
    "path": "vignettes/.gitignore",
    "chars": 11,
    "preview": "*.html\n*.R\n"
  },
  {
    "path": "vignettes/managing-personal-access-tokens.Rmd",
    "chars": 7276,
    "preview": "---\ntitle: \"Managing Personal Access Tokens\"\noutput: rmarkdown::html_vignette\nvignette: >\n  %\\VignetteIndexEntry{Managin"
  }
]
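
One mechanism visible in the previews above is the `TMPL` helper in `tests/testthat/test-old-templates.R`, which rewrites new-style `{placeholder}` endpoint templates (e.g. `GET /repos/{owner}/{repo}`) into the older `:placeholder` form so the same mock-repo tests run against both template styles. A minimal sketch of that regex transform in Python, mirroring the `gsub("[{]([^}]+)[}]", ":\\1", x)` call shown in the preview (the endpoint strings below are illustrative):

```python
import re

def tmpl(endpoint: str) -> str:
    """Rewrite new-style {placeholder} segments into the older
    :placeholder form, mirroring the TMPL helper from
    test-old-templates.R: gsub("[{]([^}]+)[}]", ":\\1", x)."""
    # Match a brace-delimited name and replace it with ':' + the name.
    return re.sub(r"\{([^}]+)\}", r":\1", endpoint)

print(tmpl("GET /repos/{owner}/{repo}"))  # → GET /repos/:owner/:repo
print(tmpl("/rate_limit"))                # → /rate_limit (no placeholders, unchanged)
```

Sourcing `test-mock-repos.R` with this `TMPL` in scope (as `test-old-templates.R` does) reuses one test file to cover both endpoint syntaxes.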

About this extraction

This page contains the full source code of the r-lib/gh GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 75 files (133.1 KB), approximately 40.6k tokens.

Extracted by GitExtract, a free GitHub repo to text converter for AI. Built by Nikandr Surkov.