[
  {
    "path": ".Rbuildignore",
    "content": "^.*\\.Rproj$\n^\\.Rproj\\.user$\n^Makefile$\n^README.Rmd$\n^.travis.yml$\n^appveyor.yml$\n^tags$\n^tests/testthat/github-token\\.txt$\n^CONTRIBUTING\\.md$\n^tests/testthat/httrmock$\n^\\.github$\n^\\.Rprofile$\n^r-packages$\n^revdep$\n^_pkgdown\\.yml$\n^docs$\n^pkgdown$\n^codecov\\.yml$\n^LICENSE\\.md$\n^[\\.]?air\\.toml$\n^\\.vscode$\n"
  },
  {
    "path": ".github/.gitignore",
    "content": "*.html\n"
  },
  {
    "path": ".github/CODEOWNERS",
    "content": "# CODEOWNERS for gh\n# https://www.tidyverse.org/development/understudies\n.github/CODEOWNERS @gaborcsardi @jennybc\n"
  },
  {
    "path": ".github/CODE_OF_CONDUCT.md",
    "content": "# Contributor Covenant Code of Conduct\n\n## Our Pledge\n\nWe as members, contributors, and leaders pledge to make participation in our\ncommunity a harassment-free experience for everyone, regardless of age, body\nsize, visible or invisible disability, ethnicity, sex characteristics, gender\nidentity and expression, level of experience, education, socio-economic status,\nnationality, personal appearance, race, caste, color, religion, or sexual\nidentity and orientation.\n\nWe pledge to act and interact in ways that contribute to an open, welcoming,\ndiverse, inclusive, and healthy community.\n\n## Our Standards\n\nExamples of behavior that contributes to a positive environment for our\ncommunity include:\n\n* Demonstrating empathy and kindness toward other people\n* Being respectful of differing opinions, viewpoints, and experiences\n* Giving and gracefully accepting constructive feedback\n* Accepting responsibility and apologizing to those affected by our mistakes,\n  and learning from the experience\n* Focusing on what is best not just for us as individuals, but for the overall\n  community\n\nExamples of unacceptable behavior include:\n\n* The use of sexualized language or imagery, and sexual attention or advances of\n  any kind\n* Trolling, insulting or derogatory comments, and personal or political attacks\n* Public or private harassment\n* Publishing others' private information, such as a physical or email address,\n  without their explicit permission\n* Other conduct which could reasonably be considered inappropriate in a\n  professional setting\n\n## Enforcement Responsibilities\n\nCommunity leaders are responsible for clarifying and enforcing our standards of\nacceptable behavior and will take appropriate and fair corrective action in\nresponse to any behavior that they deem inappropriate, threatening, offensive,\nor harmful.\n\nCommunity leaders have the right and responsibility to remove, edit, or reject\ncomments, commits, code, wiki 
edits, issues, and other contributions that are\nnot aligned to this Code of Conduct, and will communicate reasons for moderation\ndecisions when appropriate.\n\n## Scope\n\nThis Code of Conduct applies within all community spaces, and also applies when\nan individual is officially representing the community in public spaces.\nExamples of representing our community include using an official e-mail address,\nposting via an official social media account, or acting as an appointed\nrepresentative at an online or offline event.\n\n## Enforcement\n\nInstances of abusive, harassing, or otherwise unacceptable behavior may be\nreported to the community leaders responsible for enforcement at codeofconduct@posit.co. \nAll complaints will be reviewed and investigated promptly and fairly.\n\nAll community leaders are obligated to respect the privacy and security of the\nreporter of any incident.\n\n## Enforcement Guidelines\n\nCommunity leaders will follow these Community Impact Guidelines in determining\nthe consequences for any action they deem in violation of this Code of Conduct:\n\n### 1. Correction\n\n**Community Impact**: Use of inappropriate language or other behavior deemed\nunprofessional or unwelcome in the community.\n\n**Consequence**: A private, written warning from community leaders, providing\nclarity around the nature of the violation and an explanation of why the\nbehavior was inappropriate. A public apology may be requested.\n\n### 2. Warning\n\n**Community Impact**: A violation through a single incident or series of\nactions.\n\n**Consequence**: A warning with consequences for continued behavior. No\ninteraction with the people involved, including unsolicited interaction with\nthose enforcing the Code of Conduct, for a specified period of time. This\nincludes avoiding interactions in community spaces as well as external channels\nlike social media. Violating these terms may lead to a temporary or permanent\nban.\n\n### 3. 
Temporary Ban\n\n**Community Impact**: A serious violation of community standards, including\nsustained inappropriate behavior.\n\n**Consequence**: A temporary ban from any sort of interaction or public\ncommunication with the community for a specified period of time. No public or\nprivate interaction with the people involved, including unsolicited interaction\nwith those enforcing the Code of Conduct, is allowed during this period.\nViolating these terms may lead to a permanent ban.\n\n### 4. Permanent Ban\n\n**Community Impact**: Demonstrating a pattern of violation of community\nstandards, including sustained inappropriate behavior, harassment of an\nindividual, or aggression toward or disparagement of classes of individuals.\n\n**Consequence**: A permanent ban from any sort of public interaction within the\ncommunity.\n\n## Attribution\n\nThis Code of Conduct is adapted from the [Contributor Covenant][homepage],\nversion 2.1, available at\n<https://www.contributor-covenant.org/version/2/1/code_of_conduct.html>.\n\nCommunity Impact Guidelines were inspired by\n[Mozilla's code of conduct enforcement ladder](https://github.com/mozilla/inclusion).\n\nFor answers to common questions about this code of conduct, see the FAQ at\n<https://www.contributor-covenant.org/faq>. Translations are available at <https://www.contributor-covenant.org/translations>.\n\n[homepage]: https://www.contributor-covenant.org\n"
  },
  {
    "path": ".github/workflows/R-CMD-check.yaml",
    "content": "# Workflow derived from https://github.com/r-lib/actions/tree/v2/examples\n# Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help\n#\n# NOTE: This workflow is overkill for most R packages and\n# check-standard.yaml is likely a better choice.\n# usethis::use_github_action(\"check-standard\") will install it.\non:\n  push:\n    branches: [main, master]\n  pull_request:\n\nname: R-CMD-check.yaml\n\npermissions: read-all\n\njobs:\n  R-CMD-check:\n    runs-on: ${{ matrix.config.os }}\n\n    name: ${{ matrix.config.os }} (${{ matrix.config.r }})\n\n    strategy:\n      fail-fast: false\n      matrix:\n        config:\n          - {os: macos-latest,   r: 'release'}\n\n          - {os: windows-latest, r: 'release'}\n          # use 4.0 or 4.1 to check with rtools40's older compiler\n          - {os: windows-latest, r: 'oldrel-4'}\n\n          - {os: ubuntu-latest,  r: 'devel', http-user-agent: 'release'}\n          - {os: ubuntu-latest,  r: 'release'}\n          - {os: ubuntu-latest,  r: 'oldrel-1'}\n          - {os: ubuntu-latest,  r: 'oldrel-2'}\n          - {os: ubuntu-latest,  r: 'oldrel-3'}\n          - {os: ubuntu-latest,  r: 'oldrel-4'}\n\n    env:\n      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}\n      R_KEEP_PKG_SOURCE: yes\n\n    steps:\n      - uses: actions/checkout@v4\n\n      - uses: r-lib/actions/setup-pandoc@v2\n\n      - uses: r-lib/actions/setup-r@v2\n        with:\n          r-version: ${{ matrix.config.r }}\n          http-user-agent: ${{ matrix.config.http-user-agent }}\n          use-public-rspm: true\n\n      - uses: r-lib/actions/setup-r-dependencies@v2\n        with:\n          extra-packages: any::rcmdcheck\n          needs: check\n\n      - uses: r-lib/actions/check-r-package@v2\n        with:\n          upload-snapshots: true\n          build_args: 'c(\"--no-manual\",\"--compact-vignettes=gs+qpdf\")'\n"
  },
  {
    "path": ".github/workflows/pkgdown.yaml",
    "content": "# Workflow derived from https://github.com/r-lib/actions/tree/v2/examples\n# Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help\non:\n  push:\n    branches: [main, master]\n  pull_request:\n  release:\n    types: [published]\n  workflow_dispatch:\n\nname: pkgdown.yaml\n\npermissions: read-all\n\njobs:\n  pkgdown:\n    runs-on: ubuntu-latest\n    # Only restrict concurrency for non-PR jobs\n    concurrency:\n      group: pkgdown-${{ github.event_name != 'pull_request' || github.run_id }}\n    env:\n      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}\n    permissions:\n      contents: write\n    steps:\n      - uses: actions/checkout@v4\n\n      - uses: r-lib/actions/setup-pandoc@v2\n\n      - uses: r-lib/actions/setup-r@v2\n        with:\n          use-public-rspm: true\n\n      - uses: r-lib/actions/setup-r-dependencies@v2\n        with:\n          extra-packages: any::pkgdown, local::.\n          needs: website\n\n      - name: Build site\n        run: pkgdown::build_site_github_pages(new_process = FALSE, install = FALSE)\n        shell: Rscript {0}\n\n      - name: Deploy to GitHub pages 🚀\n        if: github.event_name != 'pull_request'\n        uses: JamesIves/github-pages-deploy-action@v4.5.0\n        with:\n          clean: false\n          branch: gh-pages\n          folder: docs\n"
  },
  {
    "path": ".github/workflows/pr-commands.yaml",
    "content": "# Workflow derived from https://github.com/r-lib/actions/tree/v2/examples\n# Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help\non:\n  issue_comment:\n    types: [created]\n\nname: pr-commands.yaml\n\npermissions: read-all\n\njobs:\n  document:\n    if: ${{ github.event.issue.pull_request && (github.event.comment.author_association == 'MEMBER' || github.event.comment.author_association == 'OWNER') && startsWith(github.event.comment.body, '/document') }}\n    name: document\n    runs-on: ubuntu-latest\n    env:\n      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}\n    permissions:\n      contents: write\n    steps:\n      - uses: actions/checkout@v4\n\n      - uses: r-lib/actions/pr-fetch@v2\n        with:\n          repo-token: ${{ secrets.GITHUB_TOKEN }}\n\n      - uses: r-lib/actions/setup-r@v2\n        with:\n          use-public-rspm: true\n\n      - uses: r-lib/actions/setup-r-dependencies@v2\n        with:\n          extra-packages: any::roxygen2\n          needs: pr-document\n\n      - name: Document\n        run: roxygen2::roxygenise()\n        shell: Rscript {0}\n\n      - name: commit\n        run: |\n          git config --local user.name \"$GITHUB_ACTOR\"\n          git config --local user.email \"$GITHUB_ACTOR@users.noreply.github.com\"\n          git add man/\\* NAMESPACE\n          git commit -m 'Document'\n\n      - uses: r-lib/actions/pr-push@v2\n        with:\n          repo-token: ${{ secrets.GITHUB_TOKEN }}\n\n  style:\n    if: ${{ github.event.issue.pull_request && (github.event.comment.author_association == 'MEMBER' || github.event.comment.author_association == 'OWNER') && startsWith(github.event.comment.body, '/style') }}\n    name: style\n    runs-on: ubuntu-latest\n    env:\n      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}\n    permissions:\n      contents: write\n    steps:\n      - uses: actions/checkout@v4\n\n      - uses: r-lib/actions/pr-fetch@v2\n        with:\n          
repo-token: ${{ secrets.GITHUB_TOKEN }}\n\n      - uses: r-lib/actions/setup-r@v2\n\n      - name: Install dependencies\n        run: install.packages(\"styler\")\n        shell: Rscript {0}\n\n      - name: Style\n        run: styler::style_pkg()\n        shell: Rscript {0}\n\n      - name: commit\n        run: |\n          git config --local user.name \"$GITHUB_ACTOR\"\n          git config --local user.email \"$GITHUB_ACTOR@users.noreply.github.com\"\n          git add \\*.R\n          git commit -m 'Style'\n\n      - uses: r-lib/actions/pr-push@v2\n        with:\n          repo-token: ${{ secrets.GITHUB_TOKEN }}\n"
  },
  {
    "path": ".github/workflows/rhub.yaml",
    "content": "# R-hub's generic GitHub Actions workflow file. Its canonical location is at\n# https://github.com/r-hub/rhub2/blob/v1/inst/workflow/rhub.yaml\n# You can update this file to a newer version using the rhub2 package:\n#\n# rhub2::rhub_setup()\n#\n# It is unlikely that you need to modify this file manually.\n\nname: R-hub\nrun-name: \"${{ github.event.inputs.id }}: ${{ github.event.inputs.name || format('Manually run by {0}', github.triggering_actor) }}\"\n\non:\n  workflow_dispatch:\n    inputs:\n      config:\n        description: 'A comma separated list of R-hub platforms to use.'\n        type: string\n        default: 'linux,windows,macos'\n      name:\n        description: 'Run name. You can leave this empty now.'\n        type: string\n      id:\n        description: 'Unique ID. You can leave this empty now.'\n        type: string\n\njobs:\n\n  setup:\n    runs-on: ubuntu-latest\n    outputs:\n      containers: ${{ steps.rhub-setup.outputs.containers }}\n      platforms: ${{ steps.rhub-setup.outputs.platforms }}\n\n    steps:\n    # NO NEED TO CHECKOUT HERE\n    - uses: r-hub/rhub2/actions/rhub-setup@v1\n      with:\n        config: ${{ github.event.inputs.config }}\n      id: rhub-setup\n\n  linux-containers:\n    needs: setup\n    if: ${{ needs.setup.outputs.containers != '[]' }}\n    runs-on: ubuntu-latest\n    name: ${{ matrix.config.label }}\n    strategy:\n      fail-fast: false\n      matrix:\n        config: ${{ fromJson(needs.setup.outputs.containers) }}\n    container:\n      image: ${{ matrix.config.container }}\n\n    steps:\n      - uses: actions/checkout@v4\n      - uses: r-hub/rhub2/actions/rhub-check@v1\n        with:\n          token: ${{ secrets.RHUB_TOKEN }}\n          job-config: ${{ matrix.config.job-config }}\n\n  other-platforms:\n    needs: setup\n    if: ${{ needs.setup.outputs.platforms != '[]' }}\n    runs-on: ${{ matrix.config.os }}\n    name: ${{ matrix.config.label }}\n    strategy:\n      fail-fast: false\n      matrix:\n        config: ${{ fromJson(needs.setup.outputs.platforms) }}\n\n    steps:\n      - uses: actions/checkout@v4\n      - uses: r-hub/rhub2/actions/rhub-setup-r@v1\n        with:\n          job-config: ${{ matrix.config.job-config }}\n          token: ${{ secrets.RHUB_TOKEN }}\n      - uses: r-hub/rhub2/actions/rhub-check@v1\n        with:\n          job-config: ${{ matrix.config.job-config }}\n          token: ${{ secrets.RHUB_TOKEN }}\n"
  },
  {
    "path": ".github/workflows/test-coverage.yaml",
    "content": "# Workflow derived from https://github.com/r-lib/actions/tree/v2/examples\n# Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help\non:\n  push:\n    branches: [main, master]\n  pull_request:\n\nname: test-coverage.yaml\n\npermissions: read-all\n\njobs:\n  test-coverage:\n    runs-on: ubuntu-latest\n    env:\n      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}\n\n    steps:\n      - uses: actions/checkout@v4\n\n      - uses: r-lib/actions/setup-r@v2\n        with:\n          use-public-rspm: true\n\n      - uses: r-lib/actions/setup-r-dependencies@v2\n        with:\n          extra-packages: any::covr, any::xml2\n          needs: coverage\n\n      - name: Test coverage\n        run: |\n          cov <- covr::package_coverage(\n            quiet = FALSE,\n            clean = FALSE,\n            install_path = file.path(normalizePath(Sys.getenv(\"RUNNER_TEMP\"), winslash = \"/\"), \"package\")\n          )\n          print(cov)\n          covr::to_cobertura(cov)\n        shell: Rscript {0}\n\n      - uses: codecov/codecov-action@v5\n        with:\n          # Fail if error if not on PR, or if on PR and token is given\n          fail_ci_if_error: ${{ github.event_name != 'pull_request' || secrets.CODECOV_TOKEN }}\n          files: ./cobertura.xml\n          plugins: noop\n          disable_search: true\n          token: ${{ secrets.CODECOV_TOKEN }}\n\n      - name: Show testthat output\n        if: always()\n        run: |\n          ## --------------------------------------------------------------------\n          find '${{ runner.temp }}/package' -name 'testthat.Rout*' -exec cat '{}' \\; || true\n        shell: bash\n\n      - name: Upload test results\n        if: failure()\n        uses: actions/upload-artifact@v4\n        with:\n          name: coverage-test-failures\n          path: ${{ runner.temp }}/package\n"
  },
  {
    "path": ".gitignore",
    "content": ".Rproj.user\n.Rhistory\n.RData\n/tags\n/tests/testthat/github-token.txt\n/r-packages\n/revdep\ndocs\ninst/doc\n"
  },
  {
    "path": ".vscode/extensions.json",
    "content": "{\n    \"recommendations\": [\n        \"Posit.air-vscode\"\n    ]\n}\n"
  },
  {
    "path": ".vscode/settings.json",
    "content": "{\n    \"[r]\": {\n        \"editor.formatOnSave\": true,\n        \"editor.defaultFormatter\": \"Posit.air-vscode\"\n    },\n    \"makefile.configureOnOpen\": false\n}\n"
  },
  {
    "path": "DESCRIPTION",
    "content": "Package: gh\nTitle: 'GitHub' 'API'\nVersion: 1.5.0.9000\nAuthors@R: c(\n    person(\"Gábor\", \"Csárdi\", , \"csardi.gabor@gmail.com\", role = c(\"cre\", \"ctb\")),\n    person(\"Jennifer\", \"Bryan\", role = \"aut\"),\n    person(\"Hadley\", \"Wickham\", role = \"aut\"),\n    person(\"Posit Software, PBC\", role = c(\"cph\", \"fnd\"),\n           comment = c(ROR = \"03wc8by49\"))\n  )\nDescription: Minimal client to access the 'GitHub' 'API'.\nLicense: MIT + file LICENSE\nURL: https://gh.r-lib.org/, https://github.com/r-lib/gh#readme\nBugReports: https://github.com/r-lib/gh/issues\nDepends:\n    R (>= 4.1)\nImports:\n    cli (>= 3.0.1),\n    gitcreds,\n    glue,\n    httr2 (>= 1.0.6),\n    ini,\n    jsonlite,\n    lifecycle,\n    rlang (>= 1.0.0)\nSuggests:\n    connectcreds,\n    covr,\n    knitr,\n    rmarkdown,\n    rprojroot,\n    spelling,\n    testthat (>= 3.0.0),\n    vctrs,\n    withr\nVignetteBuilder:\n    knitr\nConfig/Needs/website: tidyverse/tidytemplate\nConfig/testthat/edition: 3\nConfig/usethis/last-upkeep: 2025-04-29\nEncoding: UTF-8\nLanguage: en-US\nRoxygen: list(markdown = TRUE)\nRoxygenNote: 7.3.2.9000\n"
  },
  {
    "path": "LICENSE",
    "content": "YEAR: 2025\nCOPYRIGHT HOLDER: gh authors\n"
  },
  {
    "path": "LICENSE.md",
    "content": "# MIT License\n\nCopyright (c) 2025 gh authors\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
  },
  {
    "path": "Makefile",
    "content": "\nall: README.md\n\nREADME.md: README.Rmd\n\tRscript -e \"library(knitr); knit('$<', output = '$@', quiet = TRUE)\"\n"
  },
  {
    "path": "NAMESPACE",
    "content": "# Generated by roxygen2: do not edit by hand\n\nS3method(format,gh_pat)\nS3method(print,gh_pat)\nS3method(print,gh_response)\nS3method(str,gh_pat)\nS3method(vctrs::vec_cast,list.gh_response)\nS3method(vctrs::vec_ptype2,gh_response.gh_response)\nexport(gh)\nexport(gh_first)\nexport(gh_gql)\nexport(gh_last)\nexport(gh_next)\nexport(gh_prev)\nexport(gh_rate_limit)\nexport(gh_rate_limits)\nexport(gh_token)\nexport(gh_token_exists)\nexport(gh_tree_remote)\nexport(gh_whoami)\nimport(rlang)\nimportFrom(cli,cli_status)\nimportFrom(cli,cli_status_update)\nimportFrom(glue,glue)\nimportFrom(jsonlite,fromJSON)\nimportFrom(jsonlite,prettify)\nimportFrom(jsonlite,toJSON)\nimportFrom(lifecycle,deprecated)\nimportFrom(utils,URLencode)\nimportFrom(utils,capture.output)\n"
  },
  {
    "path": "NEWS.md",
    "content": "# gh (development version)\n\n# gh 1.5.0\n\n## BREAKING CHANGES\n\n### Posit Security Advisory (PSA) - PSA-1649\n\n* Posit acknowledges that the response header may contain sensitive\n  information. (#222) Thank you to @foysal1197 for your thorough research\n  and responsible disclosure.\n\n  `gh()`, and other functions that use it, now do not save the request\n  headers in the returned object. Consequently, if you use the `gh_next()`,\n  `gh_prev()`, `gh_first()` or `gh_last()` functions and passed `.token`\n  and/or `.send_headers` explicitly to the original `gh()` (or similar)\n  call, then you'll also need to pass the same `.token` and/or\n  `.send_headers` to `gh_next()`, `gh_prev()`, `gh_first()` or `gh_last()`.\n\n## OTHER CHANGES\n\n* New `gh_token_exists()` tells you if a valid GH token has been set.\n\n* `gh()` now uses a cache provided by httr2. This cache lives in\n  `tools::R_user_dir(\"gh\", \"cache\")`, maxes out at 100 MB, and can be\n  disabled by setting `options(gh_cache = FALSE)` (#203).\n\n* `gh_token()` can now pick up on the viewer's GitHub credentials (if any)\n  when running on Posit Connect (@atheriel, #217).\n\n# gh 1.4.1\n\n* `gh_next()`, `gh_prev()`, `gh_first()` and `gh_last()`\n  now work correctly again (#181).\n\n* When the user sets `.destfile` to write the response to disk, gh now\n  writes the output to a temporary file, which is then renamed to\n  `.destfile` after performing the request, or deleted on error (#178).\n\n# gh 1.4.0\n\n* `gh()` gains a new `.max_rate` parameter that sets the maximum number of\n  requests per second.\n\n* gh is now powered by httr2. 
This should generally have little impact on normal\n  operation but if a request fails, you can use `httr2::last_response()` and\n  `httr2::last_request()` to debug.\n\n* `gh()` gains a new `.max_wait` argument which gives the maximum number of\n  seconds to wait if you are rate limited (#67).\n\n* New `gh_rate_limits()` function reports on all rate limits for the active\n  user.\n\n* gh can now validate GitHub\n  [fine-grained](https://github.blog/security/application-security/introducing-fine-grained-personal-access-tokens-for-github/)\n  personal access tokens (@jvstein, #171).\n\n# gh 1.3.1\n\n* gh now accepts lower-case methods, i.e. both `gh::gh(\"get /users/hadley/repos\")` and `gh::gh(\"GET /users/hadley/repos\")` work (@maelle, #167).\n\n* Response headers (`\"response_headers\"`) and response content\n  (`\"response_content\"`) are now returned in error conditions so that error\n  handlers can use information, such as the rate limit reset header, when\n  handling `github_error`s (@gadenbuie, #117).\n\n# gh 1.3.0\n\n* gh now shows the correct number of records in its progress bar when\n  paginating (#147).\n\n* New `.params` argument in `gh()` to make it easier to pass parameters to\n  it programmatically (#140).\n\n# gh 1.2.1\n\n* Token validation accounts for the new format\n  [announced 2021-03-04](https://github.blog/changelog/2021-03-04-authentication-token-format-updates/)\n  and implemented on 2021-04-01 (#148, @fmichonneau).\n\n# gh 1.2.0\n\n* `gh_gql()` now passes all arguments to `gh()` (#124).\n\n* gh now handles responses from pagination better, and tries to properly\n  merge them (#136, @rundel).\n\n* gh can retrieve a PAT from the Git credential store, where the lookup is\n  based on the targeted API URL. This now uses the gitcreds package. 
The\n  environment variables consulted for URL-specific GitHub PATs have changed.\n  - For \"https://api.github.com\": `GITHUB_PAT_GITHUB_COM` now, instead of\n    `GITHUB_PAT_API_GITHUB_COM`\n  - For \"https://github.acme.com/api/v3\": `GITHUB_PAT_GITHUB_ACME_COM` now,\n    instead of `GITHUB_PAT_GITHUB_ACME_COM_API_V3`\n\n  See the documentation of the gitcreds package for details.\n\n* The keyring package is no longer used, in favor of the Git credential\n  store.\n\n* The documentation for the GitHub REST API has moved to\n  <https://docs.github.com/rest> and endpoints are now documented using\n  the URI template style of [RFC 6570](https://www.rfc-editor.org/rfc/rfc6570):\n  - Old: `GET /repos/:owner/:repo/issues`\n  - New: `GET /repos/{owner}/{repo}/issues`\n\n  gh accepts and prioritizes the new style. However, it still does parameter\n  substitution for the old style.\n\n* Fixed an error that occurred when calling `gh()` with `.progress = FALSE`\n  (@gadenbuie, #115).\n\n* `gh()` accepts named `NA` parameters that are destined for the request\n  body (#139).\n\n# gh 1.1.0\n\n* Raw responses from GitHub are now returned as raw vector.\n\n* Responses may be written to disk by providing a path in the `.destfile`\n  argument.\n\n* gh now sets `.Last.error` to the error object after an uncaught error,\n  and `.Last.error.trace` to the stack trace of the error.\n\n* `gh()` now silently drops named `NULL` parameters, and throws an\n  error for named `NA` parameters (#21, #84).\n\n* `gh()` now returns better values for empty responses, typically empty\n  lists or dictionaries (#66).\n\n* `gh()` now has an `.accept` argument to make it easier to set the\n  `Accept` HTTP header (#91).\n\n* New `gh_gql()` function to make it easier to work with the GitHub\n  GraphQL API.\n\n* gh now supports separate personal access tokens for GitHub Enterprise\n  sites. 
See `?gh_token` for details.\n\n* gh now supports storing your GitHub personal access tokens (PAT) in the\n  system keyring, via the keyring package. See `?gh_token` for details.\n\n* `gh()` can now POST raw data, which allows adding assets to releases (#56).\n\n# gh 1.0.1\n\nFirst public release.\n"
  },
  {
    "path": "R/gh-package.R",
    "content": "#' @keywords internal\n#' @aliases gh-package\n\"_PACKAGE\"\n\n# The following block is used by usethis to automatically manage\n# roxygen namespace tags. Modify with care!\n## usethis namespace: start\n#' @import rlang\n#' @importFrom cli cli_status cli_status_update\n#' @importFrom glue glue\n#' @importFrom jsonlite fromJSON toJSON\n#' @importFrom lifecycle deprecated\n#' @importFrom utils URLencode capture.output\n## usethis namespace: end\nNULL\n"
  },
  {
    "path": "R/gh.R",
    "content": "#' Query the GitHub API\n#'\n#' This is an extremely minimal client. You need to know the API\n#' to be able to use this client. All this function does is:\n#' * Try to substitute each listed parameter into `endpoint`, using the\n#'   `{parameter}` notation.\n#' * If a GET request (the default), then add all other listed parameters\n#'   as query parameters.\n#' * If not a GET request, then send the other parameters in the request\n#'   body, as JSON.\n#' * Convert the response to an R list using [jsonlite::fromJSON()].\n#'\n#' @param endpoint GitHub API endpoint. Must be one of the following forms:\n#'    * `METHOD path`, e.g. `GET /rate_limit`,\n#'    * `path`, e.g. `/rate_limit`,\n#'    * `METHOD url`, e.g. `GET https://api.github.com/rate_limit`,\n#'    * `url`, e.g. `https://api.github.com/rate_limit`.\n#'\n#'    If the method is not supplied, will use `.method`, which defaults\n#'    to `\"GET\"`.\n#' @param ... Name-value pairs giving API parameters. Will be matched into\n#'   `endpoint` placeholders, sent as query parameters in GET requests, and as a\n#'   JSON body of POST requests. If there is only one unnamed parameter, and it\n#'   is a raw vector, then it will not be JSON encoded, but sent as raw data, as\n#'   is. This can be used for example to add assets to releases. Named `NULL`\n#'   values are silently dropped. For GET requests, named `NA` values trigger an\n#'   error. For other methods, named `NA` values are included in the body of the\n#'   request, as JSON `null`.\n#' @param per_page,.per_page Number of items to return per page. If omitted,\n#'   will be substituted by `min(.limit, 100)` if `.limit` is set,\n#'   otherwise determined by the API (never greater than 100).\n#' @param .destfile Path to write response to disk. If `NULL` (default),\n#'   response will be processed and returned as an object. If path is given,\n#'   response will be written to disk in the form sent. 
gh writes the\n#'   response to a temporary file, and renames that file to `.destfile`\n#'   after the request was successful. The name of the temporary file is\n#'   created by adding a `-<random>.gh-tmp` suffix to it, where `<random>`\n#'   is an ASCII string with random characters. gh removes the temporary\n#'   file on error.\n#' @param .overwrite If `.destfile` is provided, whether to overwrite an\n#'   existing file.  Defaults to `FALSE`. If an error happens the original\n#'   file is kept.\n#' @param .token Authentication token. Defaults to [gh_token()].\n#' @param .api_url GitHub API URL (default: <https://api.github.com>). Used\n#'   if `endpoint` just contains a path. Defaults to `GITHUB_API_URL`\n#'   environment variable if set.\n#' @param .method HTTP method to use if not explicitly supplied in the\n#'    `endpoint`.\n#' @param .limit Number of records to return. This can be used\n#'   instead of manual pagination. By default it is `NULL`,\n#'   which means that the defaults of the GitHub API are used.\n#'   You can set it to a number to request more (or fewer)\n#'   records, and also to `Inf` to request all records.\n#'   Note that if you request many records, then multiple GitHub\n#'   API calls are used to get them, and this can take a potentially\n#'   long time.\n#' @param .accept The value of the `Accept` HTTP header. Defaults to\n#'   `\"application/vnd.github.v3+json\"`. If `Accept` is given in\n#'   `.send_headers`, then that will be used. This parameter can be used to\n#'   provide a custom media type, in order to access a preview feature of\n#'   the API.\n#' @param .send_headers Named character vector of header field values\n#'   (except `Authorization`, which is handled via `.token`). 
This can be\n#'   used to override or augment the default `User-Agent` header:\n#'   `\"https://github.com/r-lib/gh\"`.\n#' @param .progress Whether to show a progress indicator for calls that\n#'   need more than one HTTP request.\n#' @param .params Additional list of parameters to append to `...`.\n#'   It is easier to use this than `...` if you have your parameters in\n#'   a list already.\n#' @param .max_wait Maximum number of seconds to wait if rate limited.\n#'   Defaults to 10 minutes.\n#' @param .max_rate Maximum request rate in requests per second. Set\n#'   this to automatically throttle requests.\n#' @return Answer from the API as a `gh_response` object, which is also a\n#'   `list`. Failed requests will generate an R error. Requests that\n#'   generate a raw response will return a raw vector.\n#'\n#' @export\n#' @seealso [gh_gql()] if you want to use the GitHub GraphQL API,\n#' [gh_whoami()] for details on GitHub API token management.\n#' @examplesIf identical(Sys.getenv(\"IN_PKGDOWN\"), \"true\")\n#' ## Repositories of a user, these are equivalent\n#' gh(\"/users/hadley/repos\", .limit = 2)\n#' gh(\"/users/{username}/repos\", username = \"hadley\", .limit = 2)\n#'\n#' ## Starred repositories of a user\n#' gh(\"/users/hadley/starred\", .limit = 2)\n#' gh(\"/users/{username}/starred\", username = \"hadley\", .limit = 2)\n#' @examplesIf FALSE\n#' ## Create a repository, needs a token (see gh_token())\n#' gh(\"POST /user/repos\", name = \"foobar\")\n#' @examplesIf identical(Sys.getenv(\"IN_PKGDOWN\"), \"true\")\n#' ## Issues of a repository\n#' gh(\"/repos/hadley/dplyr/issues\")\n#' gh(\"/repos/{owner}/{repo}/issues\", owner = \"hadley\", repo = \"dplyr\")\n#'\n#' ## Automatic pagination\n#' users <- gh(\"/users\", .limit = 50)\n#' length(users)\n#' @examplesIf FALSE\n#' ## Access developer preview of Licenses API (in preview as of 2015-09-24)\n#' gh(\"/licenses\") # used to error code 415\n#' gh(\"/licenses\", .accept = 
\"application/vnd.github.drax-preview+json\")\n#' @examplesIf FALSE\n#' ## Access a GitHub Enterprise API\n#' ## Use the GITHUB_API_URL environment variable to change the default.\n#' gh(\"/user/repos\", type = \"public\", .api_url = \"https://github.foobar.edu/api/v3\")\n#' @examplesIf FALSE\n#' ## Use I() to force a body part to be sent as an array, even if length 1\n#' ## This works whether assignees has length 1 or > 1\n#' assignees <- \"gh_user\"\n#' assignees <- c(\"gh_user1\", \"gh_user2\")\n#' gh(\"PATCH /repos/OWNER/REPO/issues/1\", assignees = I(assignees))\n#' @examplesIf FALSE\n#' ## There are two ways to send JSON data. One is to supply one or\n#' ## more objects that will be converted to JSON automatically via\n#' ## jsonlite::toJSON(). In this case you sometimes need to use\n#' ## jsonlite::unbox(), because toJSON() turns scalar vectors into JSON\n#' ## arrays by default. The Content-Type header is automatically added in\n#' ## this case. For example, this request turns on GitHub Pages, using this\n#' ## API: https://docs.github.com/v3/repos/pages/#enable-a-pages-site\n#'\n#' gh::gh(\n#'   \"POST /repos/{owner}/{repo}/pages\",\n#'   owner = \"r-lib\",\n#'   repo = \"gh\",\n#'   source = list(\n#'     branch = jsonlite::unbox(\"gh-pages\"),\n#'     path = jsonlite::unbox(\"/\")\n#'   ),\n#'   .send_headers = c(Accept = \"application/vnd.github.switcheroo-preview+json\")\n#' )\n#'\n#' ## The second way is to handle the JSON encoding manually, and supply it\n#' ## as a raw vector in an unnamed argument, along with a Content-Type header:\n#'\n#' body <- '{ \"source\": { \"branch\": \"gh-pages\", \"path\": \"/\" } }'\n#' gh::gh(\n#'   \"POST /repos/{owner}/{repo}/pages\",\n#'   owner = \"r-lib\",\n#'   repo = \"gh\",\n#'   charToRaw(body),\n#'   .send_headers = c(\n#'     Accept = \"application/vnd.github.switcheroo-preview+json\",\n#'     \"Content-Type\" = \"application/json\"\n#'   )\n#' )\n#' @examplesIf FALSE\n#' ## Pass along a query to the search/code 
endpoint via the ... argument\n#' x <- gh::gh(\n#'             \"/search/code\",\n#'             q = \"installation repo:r-lib/gh\",\n#'             .send_headers = c(\"X-GitHub-Api-Version\" = \"2022-11-28\")\n#'             )\n#'  str(x, list.len = 3, give.attr = FALSE)\n#'\n#'\ngh <- function(\n  endpoint,\n  ...,\n  per_page = NULL,\n  .per_page = NULL,\n  .token = NULL,\n  .destfile = NULL,\n  .overwrite = FALSE,\n  .api_url = NULL,\n  .method = \"GET\",\n  .limit = NULL,\n  .accept = \"application/vnd.github.v3+json\",\n  .send_headers = NULL,\n  .progress = TRUE,\n  .params = list(),\n  .max_wait = 600,\n  .max_rate = NULL\n) {\n  params <- .parse_params(..., .params = .params)\n\n  check_exclusive(per_page, .per_page, .require = FALSE)\n  per_page <- per_page %||% .per_page\n  if (is.null(per_page) && !is.null(.limit)) {\n    per_page <- max(min(.limit, 100), 1)\n  }\n  if (!is.null(per_page)) {\n    params <- c(params, list(per_page = per_page))\n  }\n\n  req <- gh_build_request(\n    endpoint = endpoint,\n    params = params,\n    token = .token,\n    destfile = .destfile,\n    overwrite = .overwrite,\n    accept = .accept,\n    send_headers = .send_headers,\n    max_wait = .max_wait,\n    max_rate = .max_rate,\n    api_url = .api_url,\n    method = .method\n  )\n\n  if (req$method == \"GET\") check_named_nas(params)\n\n  raw <- gh_make_request(req)\n  res <- gh_process_response(raw, req)\n  len <- gh_response_length(res)\n\n  if (.progress && !is.null(.limit)) {\n    pages <- min(gh_extract_pages(res), ceiling(.limit / per_page))\n    cli::cli_progress_bar(\"Running gh query\", total = pages)\n    cli::cli_progress_update() # already done one\n  }\n\n  while (!is.null(.limit) && len < .limit && gh_has_next(res)) {\n    res2 <- gh_next(res, .token = .token, .send_headers = .send_headers)\n    len <- len + gh_response_length(res2)\n    if (.progress) cli::cli_progress_update()\n\n    if (!is.null(names(res2)) && identical(names(res), names(res2))) {\n      
res3 <- mapply(\n        # Handle named array case\n        function(x, y, n) {\n          # e.g. GET /search/repositories\n          z <- c(x, y)\n          atm <- is.atomic(z)\n          if (atm && n %in% c(\"total_count\", \"incomplete_results\")) {\n            y\n          } else if (atm) {\n            unique(z)\n          } else {\n            z\n          }\n        },\n        res,\n        res2,\n        names(res),\n        SIMPLIFY = FALSE\n      )\n    } else {\n      # Handle unnamed array case\n      res3 <- c(res, res2) # e.g. GET /orgs/:org/invitations\n    }\n\n    attributes(res3) <- attributes(res2)\n    res <- res3\n  }\n\n  if (.progress) cli::cli_progress_done()\n\n  # We only subset for a non-named response.\n  if (\n    !is.null(.limit) &&\n      len > .limit &&\n      !\"total_count\" %in% names(res) &&\n      length(res) == len\n  ) {\n    res_attr <- attributes(res)\n    res <- res[seq_len(.limit)]\n    attributes(res) <- res_attr\n  }\n\n  res\n}\n\ngh_response_length <- function(res) {\n  if (\n    !is.null(names(res)) && length(res) > 1 && names(res)[1] == \"total_count\"\n  ) {\n    # Ignore total_count, incomplete_results, repository_selection\n    # and take the first list element to get the length\n    lst <- vapply(res, is.list, logical(1))\n    nm <- setdiff(\n      names(res),\n      c(\"total_count\", \"incomplete_results\", \"repository_selection\")\n    )\n    tgt <- which(lst[nm])[1]\n    if (is.na(tgt)) length(res) else length(res[[nm[tgt]]])\n  } else {\n    length(res)\n  }\n}\n\ngh_make_request <- function(x, error_call = caller_env()) {\n  if (!x$method %in% c(\"GET\", \"POST\", \"PATCH\", \"PUT\", \"DELETE\")) {\n    cli::cli_abort(\"Unknown HTTP verb: {.val {x$method}}\")\n  }\n\n  req <- httr2::request(x$url)\n  req <- httr2::req_method(req, x$method)\n  req <- httr2::req_url_query(req, !!!x$query)\n\n  if (!is.null((x$body))) {\n    if (is.raw(x$body)) {\n      req <- httr2::req_body_raw(req, x$body)\n    } else 
{\n      req <- httr2::req_body_json(req, x$body, null = \"list\", digits = 4)\n    }\n  }\n  req <- httr2::req_headers(req, !!!x$headers)\n\n  # Reduce connection timeout from curl's 10s default to 5s\n  req <- httr2::req_options(req, connecttimeout_ms = 5000)\n  if (Sys.getenv(\"GH_FORCE_HTTP_1_1\") == \"true\") {\n    # 2 selects CURL_HTTP_VERSION_1_1 in libcurl's http_version enum\n    req <- httr2::req_options(req, http_version = 2)\n  }\n\n  if (!isFALSE(getOption(\"gh_cache\"))) {\n    req <- httr2::req_cache(\n      req,\n      max_size = 100 * 1024 * 1024, # 100 MB\n      path = tools::R_user_dir(\"gh\", \"cache\")\n    )\n  }\n\n  if (!is_testing()) {\n    req <- httr2::req_retry(\n      req,\n      max_tries = 3,\n      is_transient = function(resp) github_is_transient(resp, x$max_wait),\n      after = github_after\n    )\n  }\n\n  if (!is.null(x$max_rate)) {\n    req <- httr2::req_throttle(req, x$max_rate)\n  }\n\n  # allow custom handling with gh_error\n  req <- httr2::req_error(req, is_error = function(resp) FALSE)\n\n  resp <- httr2::req_perform(req, path = x$desttmp)\n  if (httr2::resp_status(resp) >= 400) {\n    gh_error(resp, gh_req = x, error_call = error_call)\n  }\n\n  resp\n}\n\n# https://docs.github.com/v3/#client-errors\ngh_error <- function(response, gh_req, error_call = caller_env()) {\n  heads <- httr2::resp_headers(response)\n  res <- httr2::resp_body_json(response)\n  status <- httr2::resp_status(response)\n  if (!is.null(gh_req$desttmp)) unlink(gh_req$desttmp)\n\n  msg <- \"GitHub API error ({status}): {heads$status %||% ''} {res$message}\"\n\n  if (status == 404) {\n    msg <- c(msg, x = c(\"URL not found: {.url {response$url}}\"))\n  }\n\n  doc_url <- res$documentation_url\n  if (!is.null(doc_url)) {\n    msg <- c(msg, c(\"i\" = \"Read more at {.url {doc_url}}\"))\n  }\n\n  errors <- res$errors\n  if (!is.null(errors)) {\n    errors <- as.data.frame(do.call(rbind, errors))\n    nms <- c(\"resource\", \"field\", \"code\", \"message\")\n    nms <- nms[nms %in% names(errors)]\n    msg <- c(\n      msg,\n  
    capture.output(print(errors[nms], row.names = FALSE))\n    )\n  }\n\n  cli::cli_abort(\n    msg,\n    class = c(\"github_error\", paste0(\"http_error_\", status)),\n    call = error_call,\n    response_headers = heads,\n    response_content = res\n  )\n}\n\n\n# use retry-after info when possible\n# https://docs.github.com/en/rest/overview/resources-in-the-rest-api#exceeding-the-rate-limit\ngithub_is_transient <- function(resp, max_wait) {\n  if (httr2::resp_status(resp) != 403) {\n    return(FALSE)\n  }\n  if (!identical(httr2::resp_header(resp, \"x-ratelimit-remaining\"), \"0\")) {\n    return(FALSE)\n  }\n\n  time <- httr2::resp_header(resp, \"x-ratelimit-reset\")\n  if (is.null(time)) {\n    return(FALSE)\n  }\n\n  time <- as.numeric(time)\n  seconds_to_wait <- time - unclass(Sys.time())\n  seconds_to_wait <= max_wait\n}\n\ngithub_after <- function(resp) {\n  time <- as.numeric(httr2::resp_header(resp, \"x-ratelimit-reset\"))\n  time - unclass(Sys.time())\n}\n"
  },
  {
    "path": "R/gh_gql.R",
    "content": "#' A simple interface for the GitHub GraphQL API v4.\n#'\n#' See more about the GraphQL API here:\n#' <https://docs.github.com/graphql>\n#'\n#' Note: pagination and the `.limit` argument do not currently work,\n#' as pagination in the GraphQL API is different from the v3 API.\n#' If you need pagination with GraphQL, you'll need to do that manually.\n#'\n#' @inheritParams gh\n#' @param query The GraphQL query, as a string.\n#' @export\n#' @seealso [gh()] for the GitHub v3 API.\n#' @examplesIf FALSE\n#' gh_gql(\"query { viewer { login }}\")\n#'\n#' # Get rate limit\n#' ratelimit_query <- \"query {\n#'   viewer {\n#'     login\n#'   }\n#'   rateLimit {\n#'     limit\n#'     cost\n#'     remaining\n#'     resetAt\n#'   }\n#' }\"\n#'\n#' gh_gql(ratelimit_query)\ngh_gql <- function(query, ...) {\n  if (\".limit\" %in% names(list(...))) {\n    stop(\"`.limit` does not work with the GraphQL API\")\n  }\n\n  gh(endpoint = \"POST /graphql\", query = query, ...)\n}\n"
  },
  {
    "path": "R/gh_rate_limit.R",
    "content": "#' Return GitHub user's current rate limits\n#'\n#' @description\n#' `gh_rate_limits()` reports on all rate limits for the authenticated user.\n#' `gh_rate_limit()` reports on rate limits for a previous successful request.\n#'\n#' Further details on GitHub's API rate limit policies are available at\n#' <https://docs.github.com/v3/#rate-limiting>.\n#'\n#' @param response A `gh_response` object from a previous `gh()` call; rate\n#' limit values are determined from the response headers.\n#' Optional; if missing, a call to \"GET /rate_limit\" will be made.\n#'\n#' @inheritParams gh\n#'\n#' @return A `list` object containing the overall `limit`, `remaining` limit, and the\n#' limit `reset` time.\n#'\n#' @export\n\ngh_rate_limit <- function(\n  response = NULL,\n  .token = NULL,\n  .api_url = NULL,\n  .send_headers = NULL\n) {\n  if (is.null(response)) {\n    # This endpoint does not count against the rate limit\n    .token <- .token %||% gh_token(.api_url)\n    response <- gh(\n      \"GET /rate_limit\",\n      .token = .token,\n      .api_url = .api_url,\n      .send_headers = .send_headers\n    )\n  }\n\n  stopifnot(inherits(response, \"gh_response\"))\n\n  http_res <- attr(response, \"response\")\n\n  reset <- as.integer(c(http_res[[\"x-ratelimit-reset\"]], NA)[1])\n  reset <- as.POSIXct(reset, origin = \"1970-01-01\")\n\n  list(\n    limit = as.integer(c(http_res[[\"x-ratelimit-limit\"]], NA)[1]),\n    remaining = as.integer(c(http_res[[\"x-ratelimit-remaining\"]], NA)[1]),\n    reset = reset\n  )\n}\n\n#' @export\n#' @rdname gh_rate_limit\ngh_rate_limits <- function(\n  .token = NULL,\n  .api_url = NULL,\n  .send_headers = NULL\n) {\n  .token <- .token %||% gh_token(.api_url)\n  response <- gh(\n    \"GET /rate_limit\",\n    .token = .token,\n    .api_url = .api_url,\n    .send_headers = .send_headers\n  )\n\n  resources <- response$resources\n\n  reset <- .POSIXct(sapply(resources, \"[[\", \"reset\"))\n\n  data.frame(\n    type = 
names(resources),\n    limit = sapply(resources, \"[[\", \"limit\"),\n    used = sapply(resources, \"[[\", \"used\"),\n    remaining = sapply(resources, \"[[\", \"remaining\"),\n    reset = reset,\n    mins_left = round((unclass(reset) - unclass(Sys.time())) / 60, 1),\n    stringsAsFactors = FALSE,\n    row.names = NULL\n  )\n}\n"
  },
  {
    "path": "R/gh_request.R",
    "content": "## Main API URL\ndefault_api_url <- function() {\n  Sys.getenv(\"GITHUB_API_URL\", unset = \"https://api.github.com\")\n}\n\n## Headers to send with each API request\ndefault_send_headers <- c(\"User-Agent\" = \"https://github.com/r-lib/gh\")\n\ngh_build_request <- function(\n  endpoint = \"/user\",\n  params = list(),\n  token = NULL,\n  destfile = NULL,\n  overwrite = NULL,\n  accept = NULL,\n  send_headers = NULL,\n  max_wait = 10,\n  max_rate = NULL,\n  api_url = NULL,\n  method = \"GET\"\n) {\n  working <- list(\n    method = method,\n    url = character(),\n    headers = NULL,\n    query = NULL,\n    body = NULL,\n    endpoint = endpoint,\n    params = params,\n    token = token,\n    accept = c(Accept = accept),\n    send_headers = send_headers,\n    api_url = api_url,\n    dest = destfile,\n    overwrite = overwrite,\n    max_wait = max_wait,\n    max_rate = max_rate\n  )\n\n  working <- gh_set_verb(working)\n  working <- gh_set_endpoint(working)\n  working <- gh_set_query(working)\n  working <- gh_set_body(working)\n  working <- gh_set_url(working)\n  working <- gh_set_headers(working)\n  working <- gh_set_temp_destfile(working)\n  working[c(\n    \"method\",\n    \"url\",\n    \"headers\",\n    \"query\",\n    \"body\",\n    \"dest\",\n    \"desttmp\",\n    \"max_wait\",\n    \"max_rate\"\n  )]\n}\n\n\n## gh_set_*(x)\n## x = a list in which we build up an httr2 request\n## x goes in, x comes out, possibly modified\n\ngh_set_verb <- function(x) {\n  if (!nzchar(x$endpoint)) {\n    return(x)\n  }\n\n  # No method defined, so use default\n  if (grepl(\"^/\", x$endpoint) || grepl(\"^http\", x$endpoint)) {\n    return(x)\n  }\n\n  # Method can be lower-case (e.g. 
copy-pasting from API docs in Firefox)\n  method <- gsub(\"^([^/ ]+)\\\\s+.*$\", \"\\\\1\", x$endpoint)\n  x$endpoint <- gsub(sprintf(\"^%s+ \", method), \"\", x$endpoint)\n  # Now switch method to upper-case\n  x$method <- toupper(method)\n  x\n}\n\ngh_set_endpoint <- function(x) {\n  params <- x$params\n  if (\n    !is_template(x$endpoint) || length(params) == 0L || has_no_names(params)\n  ) {\n    return(x)\n  }\n\n  named_params <- which(has_name(params))\n  done <- rep_len(FALSE, length(params))\n  endpoint <- endpoint2 <- x$endpoint\n\n  for (i in named_params) {\n    endpoint2 <- expand_variable(\n      varname = names(params)[i],\n      value = params[[i]][1],\n      template = endpoint\n    )\n    if (is.na(endpoint2)) {\n      cli::cli_abort(\n        \"Named NA parameters are not allowed: {names(params)[i]}\"\n      )\n    }\n    if (endpoint2 != endpoint) {\n      endpoint <- endpoint2\n      done[i] <- TRUE\n    }\n    if (!is_template(endpoint)) {\n      break\n    }\n  }\n\n  x$endpoint <- endpoint\n  x$params <- x$params[!done]\n  x$params <- cleanse_names(x$params)\n  x\n}\n\ngh_set_query <- function(x) {\n  params <- x$params\n  if (x$method != \"GET\" || length(params) == 0L) {\n    return(x)\n  }\n  stopifnot(all(has_name(params)))\n  x$query <- params\n  x$params <- NULL\n  x\n}\n\ngh_set_body <- function(x) {\n  if (length(x$params) == 0L) {\n    return(x)\n  }\n  if (x$method == \"GET\") {\n    warning(\"This is a 'GET' request and unnamed parameters are being ignored.\")\n    return(x)\n  }\n  if (length(x$params) == 1 && is.raw(x$params[[1]])) {\n    x$body <- x$params[[1]]\n  } else {\n    x$body <- x$params\n  }\n  x\n}\n\ngh_set_url <- function(x) {\n  if (grepl(\"^https?://\", x$endpoint)) {\n    x$url <- URLencode(x$endpoint)\n    x$api_url <- get_baseurl(x$url)\n  } else {\n    x$api_url <- get_apiurl(x$api_url %||% default_api_url())\n    x$url <- URLencode(paste0(x$api_url, x$endpoint))\n  }\n\n  x\n}\n\ngh_set_temp_destfile <- 
function(working) {\n  working$desttmp <- if (is.null(working$dest)) {\n    NULL\n  } else {\n    paste0(working$dest, \"-\", basename(tempfile(\"\")), \".gh-tmp\")\n  }\n  working\n}\n\nget_baseurl <- function(url) {\n  # https://github.uni.edu/api/v3/\n  if (!any(grepl(\"^https?://\", url))) {\n    stop(\"Only works with HTTP(S) protocols\")\n  }\n  prot <- sub(\"^(https?://).*$\", \"\\\\1\", url) # https://\n  rest <- sub(\"^https?://(.*)$\", \"\\\\1\", url) #         github.uni.edu/api/v3/\n  host <- sub(\"/.*$\", \"\", rest) #         github.uni.edu\n  paste0(prot, host) # https://github.uni.edu\n}\n\n# https://api.github.com --> https://github.com\n# api.github.com --> github.com\nnormalize_host <- function(x) {\n  sub(\"api[.]github[.]com\", \"github.com\", x)\n}\n\nget_hosturl <- function(url) {\n  url <- get_baseurl(url)\n  normalize_host(url)\n}\n\n# (almost) the inverse of get_hosturl()\n# https://github.com     --> https://api.github.com\n# https://github.uni.edu --> https://github.uni.edu/api/v3\nget_apiurl <- function(url) {\n  host_url <- get_hosturl(url)\n  prot_host <- strsplit(host_url, \"://\", fixed = TRUE)[[1]]\n  if (is_github_dot_com(host_url)) {\n    paste0(prot_host[[1]], \"://api.github.com\")\n  } else {\n    paste0(host_url, \"/api/v3\")\n  }\n}\n\nis_github_dot_com <- function(url) {\n  url <- get_baseurl(url)\n  url <- normalize_host(url)\n  grepl(\"^https?://github.com\", url)\n}\n\ngh_set_headers <- function(x) {\n  # x$api_url must be set properly at this point\n  auth <- gh_auth(x$token %||% gh_token(x$api_url))\n  send_headers <- gh_send_headers(x$accept, x$send_headers)\n  x$headers <- c(send_headers, auth)\n  x\n}\n\ngh_send_headers <- function(accept_header = NULL, headers = NULL) {\n  modify_vector(\n    modify_vector(default_send_headers, accept_header),\n    headers\n  )\n}\n\n# helpers ----\n# https://tools.ietf.org/html/rfc6570\n# we support what the RFC calls \"Level 1 templates\", which only require\n# simple string 
expansion of a placeholder consisting of [A-Za-z0-9_]\nis_template <- function(x) {\n  is_colon_template(x) || is_uri_template(x)\n}\n\nis_colon_template <- function(x) grepl(\":\", x)\n\nis_uri_template <- function(x) grepl(\"[{]\\\\w+?[}]\", x)\n\ntemplate_type <- function(x) {\n  if (is_uri_template(x)) {\n    return(\"uri\")\n  }\n  if (is_colon_template(x)) {\n    return(\"colon\")\n  }\n}\n\nexpand_variable <- function(varname, value, template) {\n  type <- template_type(template)\n  if (is.null(type)) {\n    return(template)\n  }\n  pattern <- switch(\n    type,\n    uri = paste0(\"[{]\", varname, \"[}]\"),\n    colon = paste0(\":\", varname, \"\\\\b\"),\n    stop(\"Internal error: unrecognized template type\")\n  )\n  gsub(pattern, value, template)\n}\n"
  },
  {
    "path": "R/gh_response.R",
    "content": "gh_process_response <- function(resp, gh_req) {\n  stopifnot(inherits(resp, \"httr2_response\"))\n\n  content_type <- httr2::resp_content_type(resp)\n  gh_media_type <- httr2::resp_header(resp, \"x-github-media-type\")\n\n  is_raw <- identical(content_type, \"application/octet-stream\") ||\n    isTRUE(grepl(\"param=raw$\", gh_media_type, ignore.case = TRUE))\n  is_ondisk <- inherits(resp$body, \"httr2_path\") && !is.null(gh_req$dest)\n  is_empty <- length(resp$body) == 0\n\n  if (is_ondisk) {\n    res <- as.character(resp$body)\n    file.rename(res, gh_req$dest)\n    res <- gh_req$dest\n  } else if (is_empty) {\n    res <- list()\n  } else if (grepl(\"^application/json\", content_type, ignore.case = TRUE)) {\n    res <- httr2::resp_body_json(resp)\n  } else if (is_raw) {\n    res <- httr2::resp_body_raw(resp)\n  } else {\n    if (grepl(\"^text/html\", content_type, ignore.case = TRUE)) {\n      warning(\"Response came back as html :(\", call. = FALSE)\n    }\n    res <- list(message = httr2::resp_body_string(resp))\n  }\n\n  attr(res, \"response\") <- httr2::resp_headers(resp)\n  attr(res, \"request\") <- remove_headers(gh_req)\n\n  if (is_ondisk) {\n    class(res) <- c(\"gh_response\", \"path\")\n  } else if (is_raw) {\n    class(res) <- c(\"gh_response\", \"raw\")\n  } else {\n    class(res) <- c(\"gh_response\", \"list\")\n  }\n  res\n}\n\nremove_headers <- function(x) {\n  x[names(x) != \"headers\"]\n}\n\n# Add vctrs methods that strip attributes from gh_response when combining,\n# enabling rectangling via unnesting etc\n# See <https://github.com/r-lib/gh/issues/161> for more details\n#' @exportS3Method vctrs::vec_ptype2\nvec_ptype2.gh_response.gh_response <- function(x, y, ...) {\n  list()\n}\n\n#' @exportS3Method vctrs::vec_cast\nvec_cast.list.gh_response <- function(x, to, ...) {\n  attributes(x) <- NULL\n  x\n}\n"
  },
  {
    "path": "R/gh_token.R",
    "content": "#' Return the local user's GitHub Personal Access Token (PAT)\n#'\n#' @description\n#' If gh can find a personal access token (PAT) via `gh_token()`, it includes\n#' the PAT in its requests. Some requests succeed without a PAT, but many\n#' require a PAT to prove the request is authorized by a specific GitHub user. A\n#' PAT also helps with rate limiting. If your gh use is more than casual, you\n#' want a PAT.\n#'\n#' gh calls [gitcreds::gitcreds_get()] with the `api_url`, which checks session\n#' environment variables (`GITHUB_PAT`, `GITHUB_TOKEN`)\n#' and then the local Git credential store for a PAT\n#' appropriate to the `api_url`. Therefore, if you have previously used a PAT\n#' with, e.g., command line Git, gh may retrieve and re-use it. You can call\n#' [gitcreds::gitcreds_get()] directly, yourself, if you want to see what is\n#' found for a specific URL. If no matching PAT is found,\n#' [gitcreds::gitcreds_get()] errors, whereas `gh_token()` does not and,\n#' instead, returns `\"\"`.\n#'\n#' See GitHub's documentation on [Creating a personal access\n#' token](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token),\n#' or use `usethis::create_github_token()` for a guided experience, including\n#' pre-selection of recommended scopes. Once you have a PAT, you can use\n#' [gitcreds::gitcreds_set()] to add it to the Git credential store. From that\n#' point on, gh (via [gitcreds::gitcreds_get()]) should be able to find it\n#' without further effort on your part.\n#'\n#' @param api_url GitHub API URL. Defaults to the `GITHUB_API_URL` environment\n#'   variable, if set, and otherwise to <https://api.github.com>.\n#'\n#' @return A string of characters, if a PAT is found, or the empty\n#'   string, otherwise. 
For convenience, the return value has an S3 class in\n#'   order to ensure that simple printing strategies don't reveal the entire\n#'   PAT.\n#'\n#' @export\n#'\n#' @examples\n#' \\dontrun{\n#' gh_token()\n#'\n#' format(gh_token())\n#'\n#' str(gh_token())\n#' }\ngh_token <- function(api_url = NULL) {\n  api_url <- api_url %||% default_api_url()\n  stopifnot(is.character(api_url), length(api_url) == 1)\n  host_url <- get_hosturl(api_url)\n  # Check for credentials supplied by Posit Connect.\n  if (is_installed(\"connectcreds\")) {\n    if (connectcreds::has_viewer_token(host_url)) {\n      token <- connectcreds::connect_viewer_token(host_url)\n      return(gh_pat(token$access_token))\n    }\n  }\n  token <- tryCatch(\n    gitcreds::gitcreds_get(host_url),\n    error = function(e) NULL\n  )\n  gh_pat(token$password %||% \"\")\n}\n\n#' @export\n#' @rdname gh_token\ngh_token_exists <- function(api_url = NULL) {\n  tryCatch(nzchar(gh_token(api_url)), error = function(e) FALSE)\n}\n\ngh_auth <- function(token) {\n  if (isTRUE(token != \"\")) {\n    if (any(grepl(\"\\\\s\", token))) {\n      warning(\"Token contains whitespace characters\")\n    }\n    c(\"Authorization\" = paste(\"token\", trim_ws(token)))\n  } else {\n    character()\n  }\n}\n\n# gh_pat class: exists in order to have a print method that hides info ----\nnew_gh_pat <- function(x) {\n  if (is.character(x) && length(x) == 1) {\n    structure(x, class = \"gh_pat\")\n  } else {\n    cli::cli_abort(\"A GitHub PAT must be a string\")\n  }\n}\n\n# validates PAT only in a very narrow, technical, and local sense\nvalidate_gh_pat <- function(x) {\n  stopifnot(inherits(x, \"gh_pat\"))\n  if (\n    x == \"\" ||\n      # https://github.blog/changelog/2021-03-04-authentication-token-format-updates/\n      # Fine-grained tokens start with \"github_pat_\".\n      # https://github.blog/changelog/2022-10-18-introducing-fine-grained-personal-access-tokens/\n      grepl(\n        
\"^(gh[pousr]_[A-Za-z0-9_]{36,251}|github_pat_[A-Za-z0-9_]{36,244})$\",\n        x\n      ) ||\n      grepl(\"^[[:xdigit:]]{40}$\", x)\n  ) {\n    x\n  } else {\n    url <- \"https://gh.r-lib.org/articles/managing-personal-access-tokens.html\"\n    cli::cli_abort(c(\n      \"Invalid GitHub PAT format\",\n      \"i\" = \"A GitHub PAT must have one of three forms:\",\n      \"*\" = \"40 hexadecimal digits (older PATs)\",\n      \"*\" = \"A 'ghp_' prefix followed by 36 to 251 more characters (newer PATs)\",\n      \"*\" = \"A 'github_pat_' prefix followed by 36 to 244 more characters (fine-grained PATs)\",\n      \"i\" = \"Read more at {.url {url}}.\"\n    ))\n  }\n}\n\ngh_pat <- function(x) {\n  validate_gh_pat(new_gh_pat(x))\n}\n\n#' @export\nformat.gh_pat <- function(x, ...) {\n  if (x == \"\") {\n    \"<no PAT>\"\n  } else {\n    obfuscate(x)\n  }\n}\n\n#' @export\nprint.gh_pat <- function(x, ...) {\n  cat(format(x), sep = \"\\n\")\n  invisible(x)\n}\n\n#' @export\nstr.gh_pat <- function(object, ...) {\n  cat(paste0(\"<gh_pat> \", format(object), \"\\n\", collapse = \"\"))\n  invisible()\n}\n\nobfuscate <- function(x, first = 4, last = 4) {\n  paste0(\n    substr(x, start = 1, stop = first),\n    \"...\",\n    substr(x, start = nchar(x) - last + 1, stop = nchar(x))\n  )\n}\n"
  },
  {
    "path": "R/gh_whoami.R",
    "content": "#' Info on current GitHub user and token\n#'\n#' Reports the name, GitHub login, and GitHub URL for the current\n#' authenticated user, the first bit of the token, and the associated scopes.\n#'\n#' Get a personal access token for the GitHub API from\n#' <https://github.com/settings/tokens> and select the scopes necessary for your\n#' planned tasks. The `repo` scope, for example, is one many are likely to need.\n#'\n#' On macOS and Windows it is best to store the token in the git credential\n#' store, where most GitHub clients, including gh, can access it. You can\n#' use the gitcreds package to add your token to the credential store:\n#'\n#' ```r\n#' gitcreds::gitcreds_set()\n#' ```\n#'\n#' See <https://gh.r-lib.org/articles/managing-personal-access-tokens.html>\n#' and <https://usethis.r-lib.org/articles/articles/git-credentials.html>\n#' for more about managing GitHub (and generic git) credentials.\n#'\n#' On other systems, including Linux, the git credential store is\n#' typically not as convenient, and you might want to store your token in\n#' the `GITHUB_PAT` environment variable, which you can set in your\n#' `.Renviron` file.\n#'\n#' @inheritParams gh\n#'\n#' @return A `gh_response` object, which is also a `list`.\n#' @export\n#'\n#' @examplesIf identical(Sys.getenv(\"IN_PKGDOWN\"), \"true\")\n#' gh_whoami()\n#' @examplesIf FALSE\n#' ## explicit token + use with GitHub Enterprise\n#' gh_whoami(\n#'   .token = \"8c70fd8419398999c9ac5bacf3192882193cadf2\",\n#'   .api_url = \"https://github.foobar.edu/api/v3\"\n#' )\ngh_whoami <- function(.token = NULL, .api_url = NULL, .send_headers = NULL) {\n  .token <- .token %||% gh_token(.api_url)\n  if (isTRUE(.token == \"\")) {\n    message(\n      \"No personal access token (PAT) available.\\n\",\n      \"Obtain a PAT from here:\\n\",\n      \"https://github.com/settings/tokens\\n\",\n      \"For more on what to do with the PAT, see ?gh_whoami.\"\n    )\n    return(invisible(NULL))\n  }\n  res <- 
gh(\n    endpoint = \"/user\",\n    .token = .token,\n    .api_url = .api_url,\n    .send_headers = .send_headers\n  )\n  scopes <- attr(res, \"response\")[[\"x-oauth-scopes\"]]\n  res <- res[c(\"name\", \"login\", \"html_url\")]\n  res$scopes <- scopes\n  res$token <- format(gh_pat(.token))\n  ## 'gh_response' class has to be restored\n  class(res) <- c(\"gh_response\", \"list\")\n  res\n}\n"
  },
  {
    "path": "R/git.R",
    "content": "#' Find the GitHub remote associated with a path\n#'\n#' This is a handy helper if you want to make gh requests related to the\n#' current project.\n#'\n#' @param path Path that is contained within a git repo.\n#' @return If the repo has a GitHub remote, a list containing `username`\n#'    and `repo`. Otherwise, an error.\n#' @export\n#' @examplesIf interactive()\n#' gh_tree_remote()\ngh_tree_remote <- function(path = \".\") {\n  github_remote(git_remotes(path), path)\n}\n\ngithub_remote <- function(x, path) {\n  remotes <- lapply(x, github_remote_parse)\n  remotes <- remotes[!vapply(remotes, is.null, logical(1))]\n\n  if (length(remotes) == 0) {\n    cli::cli_abort(\"No GitHub remotes found at {.path {path}}\")\n  }\n\n  if (length(remotes) > 1) {\n    if (any(names(remotes) == \"origin\")) {\n      warning(\"Multiple github remotes found. Using origin.\", call. = FALSE)\n      remotes <- remotes[[\"origin\"]]\n    } else {\n      warning(\"Multiple github remotes found. Using first.\", call. 
= FALSE)\n      remotes <- remotes[[1]]\n    }\n  } else {\n    remotes[[1]]\n  }\n}\n\ngithub_remote_parse <- function(x) {\n  if (length(x) == 0) {\n    return(NULL)\n  }\n  if (!grepl(\"github\", x)) {\n    return(NULL)\n  }\n\n  # https://github.com/hadley/devtools.git\n  # https://github.com/hadley/devtools\n  # git@github.com:hadley/devtools.git\n  re <- \"github[^/:]*[/:]([^/]+)/(.*?)(?:\\\\.git)?$\"\n  m <- regexec(re, x)\n  match <- regmatches(x, m)[[1]]\n\n  if (length(match) == 0) {\n    return(NULL)\n  }\n\n  list(\n    username = match[2],\n    repo = match[3]\n  )\n}\n\ngit_remotes <- function(path = \".\") {\n  conf <- git_config(path)\n  remotes <- conf[grepl(\"^remote\", names(conf))]\n\n  remotes <- discard(remotes, function(x) is.null(x$url))\n  urls <- vapply(remotes, \"[[\", \"url\", FUN.VALUE = character(1))\n\n  names(urls) <- gsub('^remote \"(.*?)\"$', \"\\\\1\", names(remotes))\n  urls\n}\n\n\ngit_config <- function(path = \".\") {\n  config_path <- file.path(repo_root(path), \".git\", \"config\")\n  if (!file.exists(config_path)) {\n    cli::cli_abort(\"git config does not exist at {.path {path}}\")\n  }\n  ini::read.ini(config_path, \"UTF-8\")\n}\n\nrepo_root <- function(path = \".\") {\n  if (!file.exists(path)) {\n    cli::cli_abort(\"Can't find repo at {.path {path}}\")\n  }\n\n  # Walk up to root directory\n  while (!has_git(path)) {\n    if (is_root(path)) {\n      cli::cli_abort(\"Could not find git root from {.path {path}}.\")\n    }\n\n    path <- dirname(path)\n  }\n\n  path\n}\n\nhas_git <- function(path) {\n  file.exists(file.path(path, \".git\"))\n}\n\nis_root <- function(path) {\n  identical(path, dirname(path))\n}\n"
  },
  {
    "path": "R/import-standalone-purrr.R",
    "content": "# Standalone file: do not edit by hand\n# Source: <https://github.com/r-lib/rlang/blob/main/R/standalone-purrr.R>\n# ----------------------------------------------------------------------\n#\n# ---\n# repo: r-lib/rlang\n# file: standalone-purrr.R\n# last-updated: 2023-02-23\n# license: https://unlicense.org\n# imports: rlang\n# ---\n#\n# This file provides a minimal shim to provide a purrr-like API on top of\n# base R functions. They are not drop-in replacements but allow a similar style\n# of programming.\n#\n# ## Changelog\n#\n# 2023-02-23:\n# * Added `list_c()`\n#\n# 2022-06-07:\n# * `transpose()` is now more consistent with purrr when inner names\n#   are not congruent (#1346).\n#\n# 2021-12-15:\n# * `transpose()` now supports empty lists.\n#\n# 2021-05-21:\n# * Fixed \"object `x` not found\" error in `imap()` (@mgirlich)\n#\n# 2020-04-14:\n# * Removed `pluck*()` functions\n# * Removed `*_cpl()` functions\n# * Used `as_function()` to allow use of `~`\n# * Used `.` prefix for helpers\n#\n# nocov start\n\nmap <- function(.x, .f, ...) {\n  .f <- as_function(.f, env = global_env())\n  lapply(.x, .f, ...)\n}\nwalk <- function(.x, .f, ...) {\n  map(.x, .f, ...)\n  invisible(.x)\n}\n\nmap_lgl <- function(.x, .f, ...) {\n  .rlang_purrr_map_mold(.x, .f, logical(1), ...)\n}\nmap_int <- function(.x, .f, ...) {\n  .rlang_purrr_map_mold(.x, .f, integer(1), ...)\n}\nmap_dbl <- function(.x, .f, ...) {\n  .rlang_purrr_map_mold(.x, .f, double(1), ...)\n}\nmap_chr <- function(.x, .f, ...) {\n  .rlang_purrr_map_mold(.x, .f, character(1), ...)\n}\n.rlang_purrr_map_mold <- function(.x, .f, .mold, ...) {\n  .f <- as_function(.f, env = global_env())\n  out <- vapply(.x, .f, .mold, ..., USE.NAMES = FALSE)\n  names(out) <- names(.x)\n  out\n}\n\nmap2 <- function(.x, .y, .f, ...) 
{\n  .f <- as_function(.f, env = global_env())\n  out <- mapply(.f, .x, .y, MoreArgs = list(...), SIMPLIFY = FALSE)\n  if (length(out) == length(.x)) {\n    set_names(out, names(.x))\n  } else {\n    set_names(out, NULL)\n  }\n}\nmap2_lgl <- function(.x, .y, .f, ...) {\n  as.vector(map2(.x, .y, .f, ...), \"logical\")\n}\nmap2_int <- function(.x, .y, .f, ...) {\n  as.vector(map2(.x, .y, .f, ...), \"integer\")\n}\nmap2_dbl <- function(.x, .y, .f, ...) {\n  as.vector(map2(.x, .y, .f, ...), \"double\")\n}\nmap2_chr <- function(.x, .y, .f, ...) {\n  as.vector(map2(.x, .y, .f, ...), \"character\")\n}\nimap <- function(.x, .f, ...) {\n  map2(.x, names(.x) %||% seq_along(.x), .f, ...)\n}\n\npmap <- function(.l, .f, ...) {\n  .f <- as.function(.f)\n  args <- .rlang_purrr_args_recycle(.l)\n  do.call(\"mapply\", c(\n    FUN = list(quote(.f)),\n    args, MoreArgs = quote(list(...)),\n    SIMPLIFY = FALSE, USE.NAMES = FALSE\n  ))\n}\n.rlang_purrr_args_recycle <- function(args) {\n  lengths <- map_int(args, length)\n  n <- max(lengths)\n\n  stopifnot(all(lengths == 1L | lengths == n))\n  to_recycle <- lengths == 1L\n  args[to_recycle] <- map(args[to_recycle], function(x) rep.int(x, n))\n\n  args\n}\n\nkeep <- function(.x, .f, ...) {\n  .x[.rlang_purrr_probe(.x, .f, ...)]\n}\ndiscard <- function(.x, .p, ...) {\n  sel <- .rlang_purrr_probe(.x, .p, ...)\n  .x[is.na(sel) | !sel]\n}\nmap_if <- function(.x, .p, .f, ...) {\n  matches <- .rlang_purrr_probe(.x, .p)\n  .x[matches] <- map(.x[matches], .f, ...)\n  .x\n}\n.rlang_purrr_probe <- function(.x, .p, ...) 
{\n  if (is_logical(.p)) {\n    stopifnot(length(.p) == length(.x))\n    .p\n  } else {\n    .p <- as_function(.p, env = global_env())\n    map_lgl(.x, .p, ...)\n  }\n}\n\ncompact <- function(.x) {\n  Filter(length, .x)\n}\n\ntranspose <- function(.l) {\n  if (!length(.l)) {\n    return(.l)\n  }\n\n  inner_names <- names(.l[[1]])\n\n  if (is.null(inner_names)) {\n    fields <- seq_along(.l[[1]])\n  } else {\n    fields <- set_names(inner_names)\n    .l <- map(.l, function(x) {\n      if (is.null(names(x))) {\n        set_names(x, inner_names)\n      } else {\n        x\n      }\n    })\n  }\n\n  # This way missing fields are subsetted as `NULL` instead of causing\n  # an error\n  .l <- map(.l, as.list)\n\n  map(fields, function(i) {\n    map(.l, .subset2, i)\n  })\n}\n\nevery <- function(.x, .p, ...) {\n  .p <- as_function(.p, env = global_env())\n\n  for (i in seq_along(.x)) {\n    if (!rlang::is_true(.p(.x[[i]], ...))) return(FALSE)\n  }\n  TRUE\n}\nsome <- function(.x, .p, ...) {\n  .p <- as_function(.p, env = global_env())\n\n  for (i in seq_along(.x)) {\n    if (rlang::is_true(.p(.x[[i]], ...))) return(TRUE)\n  }\n  FALSE\n}\nnegate <- function(.p) {\n  .p <- as_function(.p, env = global_env())\n  function(...) 
!.p(...)\n}\n\nreduce <- function(.x, .f, ..., .init) {\n  f <- function(x, y) .f(x, y, ...)\n  Reduce(f, .x, init = .init)\n}\nreduce_right <- function(.x, .f, ..., .init) {\n  f <- function(x, y) .f(y, x, ...)\n  Reduce(f, .x, init = .init, right = TRUE)\n}\naccumulate <- function(.x, .f, ..., .init) {\n  f <- function(x, y) .f(x, y, ...)\n  Reduce(f, .x, init = .init, accumulate = TRUE)\n}\naccumulate_right <- function(.x, .f, ..., .init) {\n  f <- function(x, y) .f(y, x, ...)\n  Reduce(f, .x, init = .init, right = TRUE, accumulate = TRUE)\n}\n\ndetect <- function(.x, .f, ..., .right = FALSE, .p = is_true) {\n  .p <- as_function(.p, env = global_env())\n  .f <- as_function(.f, env = global_env())\n\n  for (i in .rlang_purrr_index(.x, .right)) {\n    if (.p(.f(.x[[i]], ...))) {\n      return(.x[[i]])\n    }\n  }\n  NULL\n}\ndetect_index <- function(.x, .f, ..., .right = FALSE, .p = is_true) {\n  .p <- as_function(.p, env = global_env())\n  .f <- as_function(.f, env = global_env())\n\n  for (i in .rlang_purrr_index(.x, .right)) {\n    if (.p(.f(.x[[i]], ...))) {\n      return(i)\n    }\n  }\n  0L\n}\n.rlang_purrr_index <- function(x, right = FALSE) {\n  idx <- seq_along(x)\n  if (right) {\n    idx <- rev(idx)\n  }\n  idx\n}\n\nlist_c <- function(x) {\n  inject(c(!!!x))\n}\n\n# nocov end\n"
  },
  {
    "path": "R/pagination.R",
    "content": "extract_link <- function(gh_response, link) {\n  headers <- attr(gh_response, \"response\")\n  links <- headers$link\n  if (is.null(links)) {\n    return(NA_character_)\n  }\n  links <- trim_ws(strsplit(links, \",\")[[1]])\n  link_list <- lapply(links, function(x) {\n    x <- trim_ws(strsplit(x, \";\")[[1]])\n    name <- sub(\"^.*\\\"(.*)\\\".*$\", \"\\\\1\", x[2])\n    value <- sub(\"^<(.*)>$\", \"\\\\1\", x[1])\n    c(name, value)\n  })\n  link_list <- structure(\n    vapply(link_list, \"[\", \"\", 2),\n    names = vapply(link_list, \"[\", \"\", 1)\n  )\n\n  if (link %in% names(link_list)) {\n    link_list[[link]]\n  } else {\n    NA_character_\n  }\n}\n\ngh_has <- function(gh_response, link) {\n  url <- extract_link(gh_response, link)\n  !is.na(url)\n}\n\ngh_has_next <- function(gh_response) {\n  gh_has(gh_response, \"next\")\n}\n\ngh_link_request <- function(gh_response, link, .token, .send_headers) {\n  stopifnot(inherits(gh_response, \"gh_response\"))\n\n  url <- extract_link(gh_response, link)\n  if (is.na(url)) cli::cli_abort(\"No {link} page\")\n\n  req <- attr(gh_response, \"request\")\n  req$url <- url\n  req$token <- .token\n  req$send_headers <- .send_headers\n  req <- gh_set_headers(req)\n  req\n}\n\ngh_link <- function(gh_response, link, .token, .send_headers) {\n  req <- gh_link_request(gh_response, link, .token, .send_headers)\n  raw <- gh_make_request(req)\n  gh_process_response(raw, req)\n}\n\ngh_extract_pages <- function(gh_response) {\n  last <- extract_link(gh_response, \"last\")\n  if (!is.na(last)) {\n    as.integer(httr2::url_parse(last)$query$page)\n  } else {\n    NA\n  }\n}\n\n#' Get the next, previous, first or last page of results\n#'\n#' @details\n#' Note that these are not always defined. E.g. if the first\n#' page was queried (the default), then there are no first and previous\n#' pages defined. 
Similarly, if the last page was queried, then there is no\n#' next page defined, etc.\n#'\n#' If the requested page does not exist, an error is thrown.\n#'\n#' @param gh_response An object returned by a [gh()] call.\n#' @inheritParams gh\n#' @return Answer from the API.\n#'\n#' @seealso The `.limit` argument to [gh()] supports fetching more than\n#'   one page.\n#'\n#' @name gh_next\n#' @export\n#' @examplesIf identical(Sys.getenv(\"IN_PKGDOWN\"), \"true\")\n#' x <- gh(\"/users\")\n#' vapply(x, \"[[\", character(1), \"login\")\n#' x2 <- gh_next(x)\n#' vapply(x2, \"[[\", character(1), \"login\")\ngh_next <- function(gh_response, .token = NULL, .send_headers = NULL) {\n  gh_link(gh_response, \"next\", .token = .token, .send_headers = .send_headers)\n}\n\n#' @name gh_next\n#' @export\n\ngh_prev <- function(gh_response, .token = NULL, .send_headers = NULL) {\n  gh_link(gh_response, \"prev\", .token = .token, .send_headers = .send_headers)\n}\n\n#' @name gh_next\n#' @export\n\ngh_first <- function(gh_response, .token = NULL, .send_headers = NULL) {\n  gh_link(gh_response, \"first\", .token = .token, .send_headers = .send_headers)\n}\n\n#' @name gh_next\n#' @export\n\ngh_last <- function(gh_response, .token = NULL, .send_headers = NULL) {\n  gh_link(gh_response, \"last\", .token = .token, .send_headers = .send_headers)\n}\n"
  },
  {
    "path": "R/print.R",
    "content": "#' Print the result of a GitHub API call\n#'\n#' @param x The result object.\n#' @param ... Ignored.\n#' @return The JSON result.\n#'\n#' @importFrom jsonlite prettify toJSON\n#' @export\n#' @method print gh_response\n\nprint.gh_response <- function(x, ...) {\n  if (inherits(x, c(\"raw\", \"path\"))) {\n    attributes(x) <- list(class = class(x))\n    print.default(x)\n  } else {\n    print(toJSON(unclass(x), pretty = TRUE, auto_unbox = TRUE, force = TRUE))\n  }\n}\n"
  },
  {
    "path": "R/utils.R",
    "content": "trim_ws <- function(x) {\n  sub(\"\\\\s*$\", \"\", sub(\"^\\\\s*\", \"\", x))\n}\n\n## from devtools, among other places\ncompact <- function(x) {\n  is_empty <- vapply(x, function(x) length(x) == 0, logical(1))\n  x[!is_empty]\n}\n\n## from purrr, among other places\n`%||%` <- function(x, y) {\n  if (is.null(x)) {\n    y\n  } else {\n    x\n  }\n}\n\n## as seen in purrr, with the name `has_names()`\nhas_name <- function(x) {\n  nms <- names(x)\n  if (is.null(nms)) {\n    rep_len(FALSE, length(x))\n  } else {\n    !(is.na(nms) | nms == \"\")\n  }\n}\n\nhas_no_names <- function(x) all(!has_name(x))\n\n## if all names are \"\", strip completely\ncleanse_names <- function(x) {\n  if (has_no_names(x)) {\n    names(x) <- NULL\n  }\n  x\n}\n\n## to process HTTP headers, i.e. combine defaults w/ user-specified headers\n## in the spirit of modifyList(), except\n## x and y are vectors (not lists)\n## name comparison is case insensitive\n## http://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.2\n## x will be default headers, y will be user-specified\nmodify_vector <- function(x, y = NULL) {\n  if (length(y) == 0L) {\n    return(x)\n  }\n  lnames <- function(x) tolower(names(x))\n  c(x[!(lnames(x) %in% lnames(y))], y)\n}\n\n\ndiscard <- function(.x, .p, ...) {\n  sel <- probe(.x, .p, ...)\n  .x[is.na(sel) | !sel]\n}\nprobe <- function(.x, .p, ...) 
{\n  if (is.logical(.p)) {\n    stopifnot(length(.p) == length(.x))\n    .p\n  } else {\n    vapply(.x, .p, logical(1), ...)\n  }\n}\n\ndrop_named_nulls <- function(x) {\n  if (has_no_names(x)) {\n    return(x)\n  }\n  named <- has_name(x)\n  null <- vapply(x, is.null, logical(1))\n  cleanse_names(x[!named | !null])\n}\n\n.parse_params <- function(..., .params = list()) {\n  params <- c(list2(...), .params)\n  drop_named_nulls(params)\n}\n\ncheck_named_nas <- function(x) {\n  if (has_no_names(x)) {\n    return(x)\n  }\n  named <- has_name(x)\n  na <- vapply(x, FUN.VALUE = logical(1), function(v) {\n    is.atomic(v) && anyNA(v)\n  })\n  bad <- which(named & na)\n  if (length(bad)) {\n    str <- paste0(\"`\", names(x)[bad], \"`\", collapse = \", \")\n    stop(\"Named NA parameters are not allowed: \", str)\n  }\n}\n\ncan_load <- function(pkg) {\n  isTRUE(requireNamespace(pkg, quietly = TRUE))\n}\n\nis_interactive <- function() {\n  opt <- getOption(\"rlib_interactive\")\n  if (isTRUE(opt)) {\n    TRUE\n  } else if (identical(opt, FALSE)) {\n    FALSE\n  } else if (tolower(getOption(\"knitr.in.progress\", \"false\")) == \"true\") {\n    FALSE\n  } else if (identical(Sys.getenv(\"TESTTHAT\"), \"true\")) {\n    FALSE\n  } else {\n    interactive()\n  }\n}\n\nis_testing <- function() {\n  identical(Sys.getenv(\"TESTTHAT\"), \"true\")\n}\n"
  },
  {
    "path": "README.Rmd",
    "content": "---\noutput: github_document\n---\n\n<!-- README.md is generated from README.Rmd. Please edit that file -->\n\n```{r}\n#| label: setup\n#| include: false\nknitr::opts_chunk$set(\n  collapse = TRUE,\n  comment = \"#>\",\n  fig.path = \"man/figures/README-\",\n  out.width = \"100%\"\n)\n```\n\n# gh\n\n<!-- badges: start -->\n[![R-CMD-check](https://github.com/r-lib/gh/workflows/R-CMD-check/badge.svg)](https://github.com/r-lib/gh/actions)\n[![](https://www.r-pkg.org/badges/version/gh)](https://www.r-pkg.org/pkg/gh)\n[![CRAN Posit mirror downloads](https://cranlogs.r-pkg.org/badges/gh)](https://www.r-pkg.org/pkg/gh)\n[![R-CMD-check](https://github.com/r-lib/gh/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/r-lib/gh/actions/workflows/R-CMD-check.yaml)\n[![Codecov test coverage](https://codecov.io/gh/r-lib/gh/graph/badge.svg)](https://app.codecov.io/gh/r-lib/gh)\n<!-- badges: end -->\n\nMinimalistic client to access GitHub's\n[REST](https://docs.github.com/rest) and [GraphQL](https://docs.github.com/graphql) APIs.\n\n## Installation and setup\n\nInstall the package from CRAN as usual:\n\n```{r}\n#| eval: false\ninstall.packages(\"gh\")\n```\n\nInstall the development version from GitHub:\n\n```{r}\n#| eval: false\npak::pak(\"r-lib/gh\")\n```\n\n### Authentication\n\nThe value returned by `gh::gh_token()` is used as a Personal Access Token\n(PAT). A token is needed for some requests, and to help with rate limiting.\ngh can use your regular git credentials in the git credential store, via\nthe gitcreds package. Use `gitcreds::gitcreds_set()` to put a PAT into the\ngit credential store. If you cannot use the credential store, set the\n`GITHUB_PAT` environment variable to your PAT. 
See the details in the\n`?gh::gh_token` manual page and the manual of the gitcreds package.\n\n### API URL\n\n* The `GITHUB_API_URL` environment variable, if set, is used for the default GitHub API URL.\n\n## Usage\n\n```{r}\nlibrary(gh)\n```\n\nUse the `gh()` function to access all API endpoints. The endpoints are\nlisted in the [documentation](https://docs.github.com/rest).\n\nThe first argument of `gh()` is the endpoint. You can just copy and paste the\nAPI endpoints from the documentation. Note that the leading slash\nmust be included as well.\n\nFrom <https://docs.github.com/rest/reference/repos#list-repositories-for-a-user> you can copy and paste `GET /users/{username}/repos` into your `gh()`\ncall. E.g.\n\n```{r}\nmy_repos <- gh(\"GET /users/{username}/repos\", username = \"gaborcsardi\")\nvapply(my_repos, \"[[\", \"\", \"name\")\n```\n\nThe JSON result sent by the API is converted to an R object.\n\nParameters can be passed as extra arguments. E.g.\n\n```{r}\nmy_repos <- gh(\n  \"/users/{username}/repos\",\n  username = \"gaborcsardi\",\n  sort = \"created\")\nvapply(my_repos, \"[[\", \"\", \"name\")\n```\n\n### POST, PATCH, PUT and DELETE requests\n\nPOST, PATCH, PUT, and DELETE requests can be sent by including the\nHTTP verb before the endpoint, in the first argument. E.g. to\ncreate a repository:\n\n```{r}\n#| eval: false\nnew_repo <- gh(\"POST /user/repos\", name = \"my-new-repo-for-gh-testing\")\n```\n\nand then delete it:\n\n```{r}\n#| eval: false\ngh(\"DELETE /repos/{owner}/{repo}\", owner = \"gaborcsardi\",\n   repo = \"my-new-repo-for-gh-testing\")\n```\n\n### Tokens\n\nBy default the `GITHUB_PAT` environment variable is used. 
Alternatively,\none can set the `.token` argument of `gh()`.\n\n### Pagination\n\nSupply the `page` parameter to get subsequent pages:\n\n```{r}\nmy_repos2 <- gh(\"GET /orgs/{org}/repos\", org = \"r-lib\", page = 2)\nvapply(my_repos2, \"[[\", \"\", \"name\")\n```\n\n## Environment Variables\n\n* The `GITHUB_API_URL` environment variable is used for the default GitHub\n  API URL.\n* The `GITHUB_PAT` and `GITHUB_TOKEN` environment variables are used, if\n  set, in this order, as the default token. Consider using the git credential\n  store instead, see `?gh::gh_token`.\n\n## Code of Conduct\n\nPlease note that the gh project is released with a\n[Contributor Code of Conduct](https://gh.r-lib.org/CODE_OF_CONDUCT.html).\nBy contributing to this project, you agree to abide by its terms.\n\n## License\n\nMIT © Gábor Csárdi, Jennifer Bryan, Hadley Wickham\n"
  },
  {
    "path": "README.md",
    "content": "\n<!-- README.md is generated from README.Rmd. Please edit that file -->\n\n# gh\n\n<!-- badges: start -->\n\n[![R-CMD-check](https://github.com/r-lib/gh/workflows/R-CMD-check/badge.svg)](https://github.com/r-lib/gh/actions)\n[![](https://www.r-pkg.org/badges/version/gh)](https://www.r-pkg.org/pkg/gh)\n[![CRAN Posit mirror\ndownloads](https://cranlogs.r-pkg.org/badges/gh)](https://www.r-pkg.org/pkg/gh)\n[![R-CMD-check](https://github.com/r-lib/gh/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/r-lib/gh/actions/workflows/R-CMD-check.yaml)\n[![Codecov test\ncoverage](https://codecov.io/gh/r-lib/gh/graph/badge.svg)](https://app.codecov.io/gh/r-lib/gh)\n<!-- badges: end -->\n\nMinimalistic client to access GitHub’s\n[REST](https://docs.github.com/rest) and\n[GraphQL](https://docs.github.com/graphql) APIs.\n\n## Installation and setup\n\nInstall the package from CRAN as usual:\n\n``` r\ninstall.packages(\"gh\")\n```\n\nInstall the development version from GitHub:\n\n``` r\npak::pak(\"r-lib/gh\")\n```\n\n### Authentication\n\nThe value returned by `gh::gh_token()` is used as Personal Access Token\n(PAT). A token is needed for some requests, and to help with rate\nlimiting. gh can use your regular git credentials in the git credential\nstore, via the gitcreds package. Use `gitcreds::gitcreds_set()` to put a\nPAT into the git credential store. If you cannot use the credential\nstore, set the `GITHUB_PAT` environment variable to your PAT. See the\ndetails in the `?gh::gh_token` manual page and the manual of the\ngitcreds package.\n\n### API URL\n\n-   The `GITHUB_API_URL` environment variable, if set, is used for the\n    default github api url.\n\n## Usage\n\n``` r\nlibrary(gh)\n```\n\nUse the `gh()` function to access all API endpoints. The endpoints are\nlisted in the [documentation](https://docs.github.com/rest).\n\nThe first argument of `gh()` is the endpoint. You can just copy and\npaste the API endpoints from the documentation. 
Note that the leading\nslash must be included as well.\n\nFrom\n<https://docs.github.com/rest/reference/repos#list-repositories-for-a-user>\nyou can copy and paste `GET /users/{username}/repos` into your `gh()`\ncall. E.g.\n\n``` r\nmy_repos <- gh(\"GET /users/{username}/repos\", username = \"gaborcsardi\")\nvapply(my_repos, \"[[\", \"\", \"name\")\n#>  [1] \"after\"                \"alda\"                 \"alexr\"               \n#>  [4] \"all.primer.tutorials\" \"altlist\"              \"anticlust\"           \n#>  [7] \"argufy\"               \"ask\"                  \"async\"               \n#> [10] \"autobrew-bundler\"     \"available-work\"       \"baguette\"            \n#> [13] \"BCEA\"                 \"BH\"                   \"bigrquerystorage\"    \n#> [16] \"brew-big-sur\"         \"brokenPackage\"        \"brulee\"              \n#> [19] \"build-r-app\"          \"butcher\"              \"censored\"            \n#> [22] \"cf-tunnel\"            \"checkinstall\"         \"cli\"                 \n#> [25] \"clock\"                \"comments\"             \"covr\"                \n#> [28] \"covrlabs\"             \"cran-metadata\"        \"csg\"\n```\n\nThe JSON result sent by the API is converted to an R object.\n\nParameters can be passed as extra arguments. 
E.g.\n\n``` r\nmy_repos <- gh(\n  \"/users/{username}/repos\",\n  username = \"gaborcsardi\",\n  sort = \"created\")\nvapply(my_repos, \"[[\", \"\", \"name\")\n#>  [1] \"phantomjs\"       \"FSA\"             \"greta\"           \"webdriver\"      \n#>  [5] \"clock\"           \"testthat\"        \"jsonlite\"        \"duckdb\"         \n#>  [9] \"duckdb-r\"        \"httpuv\"          \"unwind\"          \"httr2\"          \n#> [13] \"pins-r\"          \"install-figlet\"  \"weird-package\"   \"anticlust\"      \n#> [17] \"nanoparquet-cli\" \"cf-tunnel\"       \"myweek\"          \"figlet\"         \n#> [21] \"evercran\"        \"available-work\"  \"r-shell\"         \"Rcpp\"           \n#> [25] \"openssl\"         \"openbsd-vm\"      \"cran-metadata\"   \"run-r-app\"      \n#> [29] \"build-r-app\"     \"comments\"\n```\n\n### POST, PATCH, PUT and DELETE requests\n\nPOST, PATCH, PUT, and DELETE requests can be sent by including the HTTP\nverb before the endpoint, in the first argument. E.g. to create a\nrepository:\n\n``` r\nnew_repo <- gh(\"POST /user/repos\", name = \"my-new-repo-for-gh-testing\")\n```\n\nand then delete it:\n\n``` r\ngh(\"DELETE /repos/{owner}/{repo}\", owner = \"gaborcsardi\",\n   repo = \"my-new-repo-for-gh-testing\")\n```\n\n### Tokens\n\nBy default the `GITHUB_PAT` environment variable is used. 
Alternatively,\none can set the `.token` argument of `gh()`.\n\n### Pagination\n\nSupply the `page` parameter to get subsequent pages:\n\n``` r\nmy_repos2 <- gh(\"GET /orgs/{org}/repos\", org = \"r-lib\", page = 2)\nvapply(my_repos2, \"[[\", \"\", \"name\")\n#>  [1] \"desc\"        \"profvis\"     \"sodium\"      \"gargle\"      \"remotes\"    \n#>  [6] \"jose\"        \"backports\"   \"rcmdcheck\"   \"vdiffr\"      \"callr\"      \n#> [11] \"mockery\"     \"here\"        \"revdepcheck\" \"processx\"    \"vctrs\"      \n#> [16] \"debugme\"     \"usethis\"     \"rlang\"       \"pkgload\"     \"httrmock\"   \n#> [21] \"pkgbuild\"    \"prettycode\"  \"roxygen2md\"  \"pkgapi\"      \"zeallot\"    \n#> [26] \"liteq\"       \"keyring\"     \"sloop\"       \"styler\"      \"ansistrings\"\n```\n\n## Environment Variables\n\n-   The `GITHUB_API_URL` environment variable is used for the default\n    github api url.\n-   The `GITHUB_PAT` and `GITHUB_TOKEN` environment variables are used,\n    if set, in this order, as default token. Consider using the git\n    credential store instead, see `?gh::gh_token`.\n\n## Code of Conduct\n\nPlease note that the gh project is released with a [Contributor Code of\nConduct](https://gh.r-lib.org/CODE_OF_CONDUCT.html). By contributing to\nthis project, you agree to abide by its terms.\n\n## License\n\nMIT © Gábor Csárdi, Jennifer Bryan, Hadley Wickham\n"
  },
  {
    "path": "_pkgdown.yml",
    "content": "url: https://gh.r-lib.org\n\ntemplate:\n  package: tidytemplate\n  bootstrap: 5\n  includes:\n    in_header: |\n      <script src=\"https://cdn.jsdelivr.net/gh/posit-dev/supported-by-posit/js/badge.min.js\" data-max-height=\"43\" data-light-bg=\"#666f76\" data-light-fg=\"#f9f9f9\"></script>\n      <script defer data-domain=\"gh.r-lib.org,all.tidyverse.org\" src=\"https://plausible.io/js/plausible.js\"></script>\n\n\ndevelopment:\n  mode: auto\n"
  },
  {
    "path": "air.toml",
    "content": ""
  },
  {
    "path": "codecov.yml",
    "content": "comment: false\n\ncoverage:\n  status:\n    project:\n      default:\n        target: auto\n        threshold: 1%\n        informational: true\n    patch:\n      default:\n        target: auto\n        threshold: 1%\n        informational: true\n"
  },
  {
    "path": "gh.Rproj",
    "content": "Version: 1.0\n\nRestoreWorkspace: No\nSaveWorkspace: No\nAlwaysSaveHistory: Default\n\nEnableCodeIndexing: Yes\nUseSpacesForTab: Yes\nNumSpacesForTab: 2\nEncoding: UTF-8\n\nRnwWeave: knitr\nLaTeX: XeLaTeX\n\nAutoAppendNewline: Yes\nStripTrailingWhitespace: Yes\n\nBuildType: Package\nPackageUseDevtools: Yes\nPackageInstallArgs: --no-multiarch --with-keep.source\nPackageRoxygenize: rd,collate,namespace\n"
  },
  {
    "path": "inst/WORDLIST",
    "content": "CMD\nCodecov\nGithub\nGraphQL\nJSON\nLastPass\nMinimalistic\nPATs\nPBC\nPSA\nROR\nURI\napi\nauth\ndiscoverable\nfunder\ngitcreds\ngithub\nhttr\nkeyring\nmacOS\npre\nprogrammatically\nrepo\nusethis\nwc\n"
  },
  {
    "path": "man/gh-package.Rd",
    "content": "% Generated by roxygen2: do not edit by hand\n% Please edit documentation in R/gh-package.R\n\\docType{package}\n\\name{gh-package}\n\\alias{gh-package}\n\\title{gh: 'GitHub' 'API'}\n\\description{\nMinimal client to access the 'GitHub' 'API'.\n}\n\\seealso{\nUseful links:\n\\itemize{\n  \\item \\url{https://gh.r-lib.org/}\n  \\item \\url{https://github.com/r-lib/gh#readme}\n  \\item Report bugs at \\url{https://github.com/r-lib/gh/issues}\n}\n\n}\n\\author{\n\\strong{Maintainer}: Gábor Csárdi \\email{csardi.gabor@gmail.com} [contributor]\n\nAuthors:\n\\itemize{\n  \\item Jennifer Bryan\n  \\item Hadley Wickham\n}\n\nOther contributors:\n\\itemize{\n  \\item Posit Software, PBC (03wc8by49) [copyright holder, funder]\n}\n\n}\n\\keyword{internal}\n"
  },
  {
    "path": "man/gh.Rd",
    "content": "% Generated by roxygen2: do not edit by hand\n% Please edit documentation in R/gh.R\n\\name{gh}\n\\alias{gh}\n\\title{Query the GitHub API}\n\\usage{\ngh(\n  endpoint,\n  ...,\n  per_page = NULL,\n  .per_page = NULL,\n  .token = NULL,\n  .destfile = NULL,\n  .overwrite = FALSE,\n  .api_url = NULL,\n  .method = \"GET\",\n  .limit = NULL,\n  .accept = \"application/vnd.github.v3+json\",\n  .send_headers = NULL,\n  .progress = TRUE,\n  .params = list(),\n  .max_wait = 600,\n  .max_rate = NULL\n)\n}\n\\arguments{\n\\item{endpoint}{GitHub API endpoint. Must be one of the following forms:\n\\itemize{\n\\item \\verb{METHOD path}, e.g. \\code{GET /rate_limit},\n\\item \\code{path}, e.g. \\verb{/rate_limit},\n\\item \\verb{METHOD url}, e.g. \\verb{GET https://api.github.com/rate_limit},\n\\item \\code{url}, e.g. \\verb{https://api.github.com/rate_limit}.\n}\n\nIf the method is not supplied, will use \\code{.method}, which defaults\nto \\code{\"GET\"}.}\n\n\\item{...}{Name-value pairs giving API parameters. Will be matched into\n\\code{endpoint} placeholders, sent as query parameters in GET requests, and as a\nJSON body of POST requests. If there is only one unnamed parameter, and it\nis a raw vector, then it will not be JSON encoded, but sent as raw data, as\nis. This can be used for example to add assets to releases. Named \\code{NULL}\nvalues are silently dropped. For GET requests, named \\code{NA} values trigger an\nerror. For other methods, named \\code{NA} values are included in the body of the\nrequest, as JSON \\code{null}.}\n\n\\item{per_page, .per_page}{Number of items to return per page. If omitted,\nwill be substituted by \\code{max(.limit, 100)} if \\code{.limit} is set,\notherwise determined by the API (never greater than 100).}\n\n\\item{.token}{Authentication token. Defaults to \\code{\\link[=gh_token]{gh_token()}}.}\n\n\\item{.destfile}{Path to write response to disk. 
If \\code{NULL} (default),\nresponse will be processed and returned as an object. If path is given,\nresponse will be written to disk in the form sent. gh writes the\nresponse to a temporary file, and renames that file to \\code{.destfile}\nafter the request was successful. The name of the temporary file is\ncreated by adding a \\verb{-<random>.gh-tmp} suffix to it, where \\verb{<random>}\nis an ASCII string with random characters. gh removes the temporary\nfile on error.}\n\n\\item{.overwrite}{If \\code{.destfile} is provided, whether to overwrite an\nexisting file. Defaults to \\code{FALSE}. If an error happens the original\nfile is kept.}\n\n\\item{.api_url}{GitHub API URL (default: \\url{https://api.github.com}). Used\nif \\code{endpoint} just contains a path. Defaults to \\code{GITHUB_API_URL}\nenvironment variable if set.}\n\n\\item{.method}{HTTP method to use if not explicitly supplied in the\n\\code{endpoint}.}\n\n\\item{.limit}{Number of records to return. This can be used\ninstead of manual pagination. By default it is \\code{NULL},\nwhich means that the defaults of the GitHub API are used.\nYou can set it to a number to request more (or fewer)\nrecords, and also to \\code{Inf} to request all records.\nNote that if you request many records, then multiple GitHub\nAPI calls are used to get them, and this can potentially take a\nlong time.}\n\n\\item{.accept}{The value of the \\code{Accept} HTTP header. Defaults to\n\\code{\"application/vnd.github.v3+json\"}. If \\code{Accept} is given in\n\\code{.send_headers}, then that will be used. This parameter can be used to\nprovide a custom media type, in order to access a preview feature of\nthe API.}\n\n\\item{.send_headers}{Named character vector of header field values\n(except \\code{Authorization}, which is handled via \\code{.token}). 
This can be\nused to override or augment the default \\code{User-Agent} header:\n\\code{\"https://github.com/r-lib/gh\"}.}\n\n\\item{.progress}{Whether to show a progress indicator for calls that\nneed more than one HTTP request.}\n\n\\item{.params}{Additional list of parameters to append to \\code{...}.\nIt is easier to use this than \\code{...} if you have your parameters in\na list already.}\n\n\\item{.max_wait}{Maximum number of seconds to wait if rate limited.\nDefaults to 10 minutes.}\n\n\\item{.max_rate}{Maximum request rate in requests per second. Set\nthis to automatically throttle requests.}\n}\n\\value{\nAnswer from the API as a \\code{gh_response} object, which is also a\n\\code{list}. Failed requests will generate an R error. Requests that\ngenerate a raw response will return a raw vector.\n}\n\\description{\nThis is an extremely minimal client. You need to know the API\nto be able to use this client. All this function does is:\n\\itemize{\n\\item Try to substitute each listed parameter into \\code{endpoint}, using the\n\\code{{parameter}} notation.\n\\item If a GET request (the default), then add all other listed parameters\nas query parameters.\n\\item If not a GET request, then send the other parameters in the request\nbody, as JSON.\n\\item Convert the response to an R list using \\code{\\link[jsonlite:fromJSON]{jsonlite::fromJSON()}}.\n}\n}\n\\examples{\n\\dontshow{if (identical(Sys.getenv(\"IN_PKGDOWN\"), \"true\")) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\n## Repositories of a user, these are equivalent\ngh(\"/users/hadley/repos\", .limit = 2)\ngh(\"/users/{username}/repos\", username = \"hadley\", .limit = 2)\n\n## Starred repositories of a user\ngh(\"/users/hadley/starred\", .limit = 2)\ngh(\"/users/{username}/starred\", username = \"hadley\", .limit = 2)\n\\dontshow{\\}) # examplesIf}\n\\dontshow{if (FALSE) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\n## Create a repository, 
needs a token (see gh_token())\ngh(\"POST /user/repos\", name = \"foobar\")\n\\dontshow{\\}) # examplesIf}\n\\dontshow{if (identical(Sys.getenv(\"IN_PKGDOWN\"), \"true\")) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\n## Issues of a repository\ngh(\"/repos/hadley/dplyr/issues\")\ngh(\"/repos/{owner}/{repo}/issues\", owner = \"hadley\", repo = \"dplyr\")\n\n## Automatic pagination\nusers <- gh(\"/users\", .limit = 50)\nlength(users)\n\\dontshow{\\}) # examplesIf}\n\\dontshow{if (FALSE) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\n## Access developer preview of Licenses API (in preview as of 2015-09-24)\ngh(\"/licenses\") # used to error code 415\ngh(\"/licenses\", .accept = \"application/vnd.github.drax-preview+json\")\n\\dontshow{\\}) # examplesIf}\n\\dontshow{if (FALSE) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\n## Access Github Enterprise API\n## Use GITHUB_API_URL environment variable to change the default.\ngh(\"/user/repos\", type = \"public\", .api_url = \"https://github.foobar.edu/api/v3\")\n\\dontshow{\\}) # examplesIf}\n\\dontshow{if (FALSE) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\n## Use I() to force body part to be sent as an array, even if length 1\n## This works whether assignees has length 1 or > 1\nassignees <- \"gh_user\"\nassignees <- c(\"gh_user1\", \"gh_user2\")\ngh(\"PATCH /repos/OWNER/REPO/issues/1\", assignees = I(assignees))\n\\dontshow{\\}) # examplesIf}\n\\dontshow{if (FALSE) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\n## There are two ways to send JSON data. One is that you supply one or\n## more objects that will be converted to JSON automatically via\n## jsonlite::toJSON(). In this case sometimes you need to use\n## jsonlite::unbox() because fromJSON() creates lists from scalar vectors\n## by default. The Content-Type header is automatically added in this\n## case. 
For example this request turns on GitHub Pages, using this\n## API: https://docs.github.com/v3/repos/pages/#enable-a-pages-site\n\ngh::gh(\n  \"POST /repos/{owner}/{repo}/pages\",\n  owner = \"r-lib\",\n  repo = \"gh\",\n  source = list(\n    branch = jsonlite::unbox(\"gh-pages\"),\n    path = jsonlite::unbox(\"/\")\n  ),\n  .send_headers = c(Accept = \"application/vnd.github.switcheroo-preview+json\")\n)\n\n## The second way is to handle the JSON encoding manually, and supply it\n## as a raw vector in an unnamed argument, and also a Content-Type header:\n\nbody <- '{ \"source\": { \"branch\": \"gh-pages\", \"path\": \"/\" } }'\ngh::gh(\n  \"POST /repos/{owner}/{repo}/pages\",\n  owner = \"r-lib\",\n  repo = \"gh\",\n  charToRaw(body),\n  .send_headers = c(\n    Accept = \"application/vnd.github.switcheroo-preview+json\",\n    \"Content-Type\" = \"application/json\"\n  )\n)\n\\dontshow{\\}) # examplesIf}\n\\dontshow{if (FALSE) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\n## Pass along a query to the search/code endpoint via the ... argument\nx <- gh::gh(\n            \"/search/code\",\n            q = \"installation repo:r-lib/gh\",\n            .send_headers = c(\"X-GitHub-Api-Version\" = \"2022-11-28\")\n            )\n str(x, list.len = 3, give.attr = FALSE)\n\n\\dontshow{\\}) # examplesIf}\n}\n\\seealso{\n\\code{\\link[=gh_gql]{gh_gql()}} if you want to use the GitHub GraphQL API,\n\\code{\\link[=gh_whoami]{gh_whoami()}} for details on GitHub API token management.\n}\n"
  },
  {
    "path": "man/gh_gql.Rd",
    "content": "% Generated by roxygen2: do not edit by hand\n% Please edit documentation in R/gh_gql.R\n\\name{gh_gql}\n\\alias{gh_gql}\n\\title{A simple interface for the GitHub GraphQL API v4.}\n\\usage{\ngh_gql(query, ...)\n}\n\\arguments{\n\\item{query}{The GraphQL query, as a string.}\n\n\\item{...}{Name-value pairs giving API parameters. Will be matched into\n\\code{endpoint} placeholders, sent as query parameters in GET requests, and as a\nJSON body of POST requests. If there is only one unnamed parameter, and it\nis a raw vector, then it will not be JSON encoded, but sent as raw data, as\nis. This can be used for example to add assets to releases. Named \\code{NULL}\nvalues are silently dropped. For GET requests, named \\code{NA} values trigger an\nerror. For other methods, named \\code{NA} values are included in the body of the\nrequest, as JSON \\code{null}.}\n}\n\\description{\nSee more about the GraphQL API here:\n\\url{https://docs.github.com/graphql}\n}\n\\details{\nNote: pagination and the \\code{.limit} argument does not work currently,\nas pagination in the GraphQL API is different from the v3 API.\nIf you need pagination with GraphQL, you'll need to do that manually.\n}\n\\examples{\n\\dontshow{if (FALSE) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\ngh_gql(\"query { viewer { login }}\")\n\n# Get rate limit\nratelimit_query <- \"query {\n  viewer {\n    login\n  }\n  rateLimit {\n    limit\n    cost\n    remaining\n    resetAt\n  }\n}\"\n\ngh_gql(ratelimit_query)\n\\dontshow{\\}) # examplesIf}\n}\n\\seealso{\n\\code{\\link[=gh]{gh()}} for the GitHub v3 API.\n}\n"
  },
  {
    "path": "man/gh_next.Rd",
    "content": "% Generated by roxygen2: do not edit by hand\n% Please edit documentation in R/pagination.R\n\\name{gh_next}\n\\alias{gh_next}\n\\alias{gh_prev}\n\\alias{gh_first}\n\\alias{gh_last}\n\\title{Get the next, previous, first or last page of results}\n\\usage{\ngh_next(gh_response, .token = NULL, .send_headers = NULL)\n\ngh_prev(gh_response, .token = NULL, .send_headers = NULL)\n\ngh_first(gh_response, .token = NULL, .send_headers = NULL)\n\ngh_last(gh_response, .token = NULL, .send_headers = NULL)\n}\n\\arguments{\n\\item{gh_response}{An object returned by a \\code{\\link[=gh]{gh()}} call.}\n\n\\item{.token}{Authentication token. Defaults to \\code{\\link[=gh_token]{gh_token()}}.}\n\n\\item{.send_headers}{Named character vector of header field values\n(except \\code{Authorization}, which is handled via \\code{.token}). This can be\nused to override or augment the default \\code{User-Agent} header:\n\\code{\"https://github.com/r-lib/gh\"}.}\n}\n\\value{\nAnswer from the API.\n}\n\\description{\nGet the next, previous, first or last page of results\n}\n\\details{\nNote that these are not always defined. E.g. if the first\npage was queried (the default), then there are no first and previous\npages defined. If there is no next page, then there is no\nnext page defined, etc.\n\nIf the requested page does not exist, an error is thrown.\n}\n\\examples{\n\\dontshow{if (identical(Sys.getenv(\"IN_PKGDOWN\"), \"true\")) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\nx <- gh(\"/users\")\nvapply(x, \"[[\", character(1), \"login\")\nx2 <- gh_next(x)\nvapply(x2, \"[[\", character(1), \"login\")\n\\dontshow{\\}) # examplesIf}\n}\n\\seealso{\nThe \\code{.limit} argument to \\code{\\link[=gh]{gh()}} supports fetching more than\none page.\n}\n"
  },
  {
    "path": "man/gh_rate_limit.Rd",
    "content": "% Generated by roxygen2: do not edit by hand\n% Please edit documentation in R/gh_rate_limit.R\n\\name{gh_rate_limit}\n\\alias{gh_rate_limit}\n\\alias{gh_rate_limits}\n\\title{Return GitHub user's current rate limits}\n\\usage{\ngh_rate_limit(\n  response = NULL,\n  .token = NULL,\n  .api_url = NULL,\n  .send_headers = NULL\n)\n\ngh_rate_limits(.token = NULL, .api_url = NULL, .send_headers = NULL)\n}\n\\arguments{\n\\item{response}{\\code{gh_response} object from a previous \\code{gh} call, rate\nlimit values are determined from values in the response header.\nOptional argument, if missing a call to \"GET /rate_limit\" will be made.}\n\n\\item{.token}{Authentication token. Defaults to \\code{\\link[=gh_token]{gh_token()}}.}\n\n\\item{.api_url}{Github API url (default: \\url{https://api.github.com}). Used\nif \\code{endpoint} just contains a path. Defaults to \\code{GITHUB_API_URL}\nenvironment variable if set.}\n\n\\item{.send_headers}{Named character vector of header field values\n(except \\code{Authorization}, which is handled via \\code{.token}). This can be\nused to override or augment the default \\code{User-Agent} header:\n\\code{\"https://github.com/r-lib/gh\"}.}\n}\n\\value{\nA \\code{list} object containing the overall \\code{limit}, \\code{remaining} limit, and the\nlimit \\code{reset} time.\n}\n\\description{\n\\code{gh_rate_limits()} reports on all rate limits for the authenticated user.\n\\code{gh_rate_limit()} reports on rate limits for previous successful request.\n\nFurther details on GitHub's API rate limit policies are available at\n\\url{https://docs.github.com/v3/#rate-limiting}.\n}\n"
  },
  {
    "path": "man/gh_token.Rd",
    "content": "% Generated by roxygen2: do not edit by hand\n% Please edit documentation in R/gh_token.R\n\\name{gh_token}\n\\alias{gh_token}\n\\alias{gh_token_exists}\n\\title{Return the local user's GitHub Personal Access Token (PAT)}\n\\usage{\ngh_token(api_url = NULL)\n\ngh_token_exists(api_url = NULL)\n}\n\\arguments{\n\\item{api_url}{GitHub API URL. Defaults to the \\code{GITHUB_API_URL} environment\nvariable, if set, and otherwise to \\url{https://api.github.com}.}\n}\n\\value{\nA string of characters, if a PAT is found, or the empty\nstring, otherwise. For convenience, the return value has an S3 class in\norder to ensure that simple printing strategies don't reveal the entire\nPAT.\n}\n\\description{\nIf gh can find a personal access token (PAT) via \\code{gh_token()}, it includes\nthe PAT in its requests. Some requests succeed without a PAT, but many\nrequire a PAT to prove the request is authorized by a specific GitHub user. A\nPAT also helps with rate limiting. If your gh use is more than casual, you\nwant a PAT.\n\ngh calls \\code{\\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_get()}} with the \\code{api_url}, which checks session\nenvironment variables (\\code{GITHUB_PAT}, \\code{GITHUB_TOKEN})\nand then the local Git credential store for a PAT\nappropriate to the \\code{api_url}. Therefore, if you have previously used a PAT\nwith, e.g., command line Git, gh may retrieve and re-use it. You can call\n\\code{\\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_get()}} directly, yourself, if you want to see what is\nfound for a specific URL. 
If no matching PAT is found,\n\\code{\\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_get()}} errors, whereas \\code{gh_token()} does not and,\ninstead, returns \\code{\"\"}.\n\nSee GitHub's documentation on \\href{https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token}{Creating a personal access token},\nor use \\code{usethis::create_github_token()} for a guided experience, including\npre-selection of recommended scopes. Once you have a PAT, you can use\n\\code{\\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_set()}} to add it to the Git credential store. From that\npoint on, gh (via \\code{\\link[gitcreds:gitcreds_get]{gitcreds::gitcreds_get()}}) should be able to find it\nwithout further effort on your part.\n}\n\\examples{\n\\dontrun{\ngh_token()\n\nformat(gh_token())\n\nstr(gh_token())\n}\n}\n"
  },
  {
    "path": "man/gh_tree_remote.Rd",
    "content": "% Generated by roxygen2: do not edit by hand\n% Please edit documentation in R/git.R\n\\name{gh_tree_remote}\n\\alias{gh_tree_remote}\n\\title{Find the GitHub remote associated with a path}\n\\usage{\ngh_tree_remote(path = \".\")\n}\n\\arguments{\n\\item{path}{Path that is contained within a git repo.}\n}\n\\value{\nIf the repo has a github remote, a list containing \\code{username}\nand \\code{repo}. Otherwise, an error.\n}\n\\description{\nThis is handy helper if you want to make gh requests related to the\ncurrent project.\n}\n\\examples{\n\\dontshow{if (interactive()) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\ngh_tree_remote()\n\\dontshow{\\}) # examplesIf}\n}\n"
  },
  {
    "path": "man/gh_whoami.Rd",
    "content": "% Generated by roxygen2: do not edit by hand\n% Please edit documentation in R/gh_whoami.R\n\\name{gh_whoami}\n\\alias{gh_whoami}\n\\title{Info on current GitHub user and token}\n\\usage{\ngh_whoami(.token = NULL, .api_url = NULL, .send_headers = NULL)\n}\n\\arguments{\n\\item{.token}{Authentication token. Defaults to \\code{\\link[=gh_token]{gh_token()}}.}\n\n\\item{.api_url}{Github API url (default: \\url{https://api.github.com}). Used\nif \\code{endpoint} just contains a path. Defaults to \\code{GITHUB_API_URL}\nenvironment variable if set.}\n\n\\item{.send_headers}{Named character vector of header field values\n(except \\code{Authorization}, which is handled via \\code{.token}). This can be\nused to override or augment the default \\code{User-Agent} header:\n\\code{\"https://github.com/r-lib/gh\"}.}\n}\n\\value{\nA \\code{gh_response} object, which is also a \\code{list}.\n}\n\\description{\nReports wallet name, GitHub login, and GitHub URL for the current\nauthenticated user, the first bit of the token, and the associated scopes.\n}\n\\details{\nGet a personal access token for the GitHub API from\n\\url{https://github.com/settings/tokens} and select the scopes necessary for your\nplanned tasks. The \\code{repo} scope, for example, is one many are likely to need.\n\nOn macOS and Windows it is best to store the token in the git credential\nstore, where most GitHub clients, including gh, can access it. 
You can\nuse the gitcreds package to add your token to the credential store:\n\n\\if{html}{\\out{<div class=\"sourceCode r\">}}\\preformatted{gitcreds::gitcreds_set()\n}\\if{html}{\\out{</div>}}\n\nSee \\url{https://gh.r-lib.org/articles/managing-personal-access-tokens.html}\nand \\url{https://usethis.r-lib.org/articles/articles/git-credentials.html}\nfor more about managing GitHub (and generic git) credentials.\n\nOn other systems, including Linux, the git credential store is\ntypically not as convenient, and you might want to store your token in\nthe \\code{GITHUB_PAT} environment variable, which you can set in your\n\\code{.Renviron} file.\n}\n\\examples{\n\\dontshow{if (identical(Sys.getenv(\"IN_PKGDOWN\"), \"true\")) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\ngh_whoami()\n\\dontshow{\\}) # examplesIf}\n\\dontshow{if (FALSE) (if (getRversion() >= \"3.4\") withAutoprint else force)(\\{ # examplesIf}\n## explicit token + use with GitHub Enterprise\ngh_whoami(\n  .token = \"8c70fd8419398999c9ac5bacf3192882193cadf2\",\n  .api_url = \"https://github.foobar.edu/api/v3\"\n)\n\\dontshow{\\}) # examplesIf}\n}\n"
  },
  {
    "path": "man/print.gh_response.Rd",
    "content": "% Generated by roxygen2: do not edit by hand\n% Please edit documentation in R/print.R\n\\name{print.gh_response}\n\\alias{print.gh_response}\n\\title{Print the result of a GitHub API call}\n\\usage{\n\\method{print}{gh_response}(x, ...)\n}\n\\arguments{\n\\item{x}{The result object.}\n\n\\item{...}{Ignored.}\n}\n\\value{\nThe JSON result.\n}\n\\description{\nPrint the result of a GitHub API call\n}\n"
  },
  {
    "path": "tests/testthat/_snaps/gh.md",
    "content": "# generates a useful message\n\n    Code\n      gh(\"/missing\")\n    Condition\n      Error in `gh()`:\n      ! GitHub API error (404): Not Found\n      x URL not found: <https://api.github.com/missing>\n      i Read more at <https://docs.github.com/rest>\n\n# can use per_page or .per_page but not both\n\n    Code\n      gh(\"/orgs/tidyverse/repos\", per_page = 1, .per_page = 2)\n    Condition\n      Error in `gh()`:\n      ! Exactly one of `per_page` or `.per_page` must be supplied.\n\n"
  },
  {
    "path": "tests/testthat/_snaps/gh_rate_limit.md",
    "content": "# errors\n\n    Code\n      gh_rate_limit(list())\n    Condition\n      Error in `gh_rate_limit()`:\n      ! inherits(response, \"gh_response\") is not TRUE\n    Code\n      gh_rate_limits(.token = \"bad\")\n    Condition\n      Error in `gh()`:\n      ! GitHub API error (401): Bad credentials\n      i Read more at <https://docs.github.com/rest>\n\n"
  },
  {
    "path": "tests/testthat/_snaps/gh_request.md",
    "content": "# gh_set_endpoint() refuses to substitute an NA\n\n    Code\n      gh_set_endpoint(input)\n    Condition\n      Error in `gh_set_endpoint()`:\n      ! Named NA parameters are not allowed: org\n\n# gh_make_request() errors if unknown verb\n\n    Unknown HTTP verb: \"GEEET\"\n\n"
  },
  {
    "path": "tests/testthat/_snaps/gh_response.md",
    "content": "# warns if output is HTML\n\n    Code\n      res <- gh(\"POST /markdown\", text = \"foo\")\n    Condition\n      Warning:\n      Response came back as html :(\n\n"
  },
  {
    "path": "tests/testthat/_snaps/gh_token.md",
    "content": "# get_baseurl() insists on http(s)\n\n    Code\n      get_baseurl(\"github.com\")\n    Condition\n      Error in `get_baseurl()`:\n      ! Only works with HTTP(S) protocols\n    Code\n      get_baseurl(\"github.acme.com\")\n    Condition\n      Error in `get_baseurl()`:\n      ! Only works with HTTP(S) protocols\n\n"
  },
  {
    "path": "tests/testthat/_snaps/gh_whoami.md",
    "content": "# whoami errors with bad/absent PAT\n\n    Code\n      gh_whoami(.token = \"\")\n    Message\n      No personal access token (PAT) available.\n      Obtain a PAT from here:\n      https://github.com/settings/tokens\n      For more on what to do with the PAT, see ?gh_whoami.\n    Code\n      gh_whoami(.token = NA)\n    Condition\n      Error in `gh()`:\n      ! GitHub API error (401): Requires authentication\n      i Read more at <https://docs.github.com/rest/users/users#get-the-authenticated-user>\n    Code\n      gh_whoami(.token = \"blah\")\n    Condition\n      Error in `gh()`:\n      ! GitHub API error (401): Bad credentials\n      i Read more at <https://docs.github.com/rest>\n\n"
  },
  {
    "path": "tests/testthat/_snaps/pagination.md",
    "content": "# can extract relative pages\n\n    Code\n      gh_prev(page1)\n    Condition\n      Error in `gh_link_request()`:\n      ! No prev page\n\n"
  },
  {
    "path": "tests/testthat/_snaps/print.md",
    "content": "# can print all types of object\n\n    Code\n      json\n    Output\n      {\n        \"name\": \"LICENSE\",\n        \"path\": \"LICENSE\",\n        \"sha\": \"c71242092c79fcc895841ca3e7de5bbcc551cde5\",\n        \"size\": 81,\n        \"url\": \"https://api.github.com/repos/r-lib/gh/contents/LICENSE?ref=v1.2.0\",\n        \"html_url\": \"https://github.com/r-lib/gh/blob/v1.2.0/LICENSE\",\n        \"git_url\": \"https://api.github.com/repos/r-lib/gh/git/blobs/c71242092c79fcc895841ca3e7de5bbcc551cde5\",\n        \"download_url\": \"https://raw.githubusercontent.com/r-lib/gh/v1.2.0/LICENSE\",\n        \"type\": \"file\",\n        \"content\": \"WUVBUjogMjAxNS0yMDIwCkNPUFlSSUdIVCBIT0xERVI6IEfDoWJvciBDc8Oh\\ncmRpLCBKZW5uaWZlciBCcnlhbiwgSGFkbGV5IFdpY2toYW0K\\n\",\n        \"encoding\": \"base64\",\n        \"_links\": {\n          \"self\": \"https://api.github.com/repos/r-lib/gh/contents/LICENSE?ref=v1.2.0\",\n          \"git\": \"https://api.github.com/repos/r-lib/gh/git/blobs/c71242092c79fcc895841ca3e7de5bbcc551cde5\",\n          \"html\": \"https://github.com/r-lib/gh/blob/v1.2.0/LICENSE\"\n        }\n      } \n    Code\n      file\n    Output\n      [1] \"LICENSE\"\n      attr(,\"class\")\n      [1] \"gh_response\" \"path\"       \n    Code\n      raw\n    Output\n       [1] 59 45 41 52 3a 20 32 30 31 35 2d 32 30 32 30 0a 43 4f 50 59 52 49 47 48 54\n      [26] 20 48 4f 4c 44 45 52 3a 20 47 c3 a1 62 6f 72 20 43 73 c3 a1 72 64 69 2c 20\n      [51] 4a 65 6e 6e 69 66 65 72 20 42 72 79 61 6e 2c 20 48 61 64 6c 65 79 20 57 69\n      [76] 63 6b 68 61 6d 0a\n      attr(,\"class\")\n      [1] \"gh_response\" \"raw\"        \n\n"
  },
  {
    "path": "tests/testthat/_snaps/utils.md",
    "content": "# named NA is error\n\n    Code\n      check_named_nas(tc)\n    Condition\n      Error in `check_named_nas()`:\n      ! Named NA parameters are not allowed: `a`\n\n---\n\n    Code\n      check_named_nas(tc)\n    Condition\n      Error in `check_named_nas()`:\n      ! Named NA parameters are not allowed: `a`\n\n---\n\n    Code\n      check_named_nas(tc)\n    Condition\n      Error in `check_named_nas()`:\n      ! Named NA parameters are not allowed: `c`\n\n"
  },
  {
    "path": "tests/testthat/helper-offline.R",
    "content": "skip_if_no_github <- function(has_scope = NULL) {\n  skip_if_offline(\"github.com\")\n  skip_on_cran()\n\n  if (gh_token() == \"\") {\n    skip(\"No GitHub token\")\n  }\n\n  if (!is.null(has_scope) && !has_scope %in% test_scopes()) {\n    skip(cli::format_inline(\"Current token lacks '{has_scope}' scope\"))\n  }\n}\n\ntest_scopes <- function() {\n  # whoami fails on GHA\n  whoami <- env_cache(\n    cache,\n    \"whoami\",\n    tryCatch(\n      gh_whoami(),\n      error = function(err) list(scopes = \"\")\n    )\n  )\n  strsplit(whoami$scopes, \", \")[[1]]\n}\n\ncache <- new_environment()\n"
  },
  {
    "path": "tests/testthat/helper.R",
    "content": "test_package_root <- function() {\n  x <- tryCatch(\n    rprojroot::find_package_root_file(),\n    error = function(e) NULL\n  )\n\n  if (!is.null(x)) {\n    return(x)\n  }\n\n  pkg <- testthat::testing_package()\n  x <- tryCatch(\n    rprojroot::find_package_root_file(\n      path = file.path(\"..\", \"..\", \"00_pkg_src\", pkg)\n    ),\n    error = function(e) NULL\n  )\n\n  if (!is.null(x)) {\n    return(x)\n  }\n\n  stop(\"Cannot find package root\")\n}\n"
  },
  {
    "path": "tests/testthat/setup.R",
    "content": "withr::local_options(\n  gh_cache = FALSE,\n  .local_envir = testthat::teardown_env()\n)\n"
  },
  {
    "path": "tests/testthat/test-gh.R",
    "content": "test_that(\"generates a useful message\", {\n  skip_if_no_github()\n\n  expect_snapshot(gh(\"/missing\"), error = TRUE)\n})\n\ntest_that(\"errors return a github_error object\", {\n  skip_if_no_github()\n\n  e <- tryCatch(gh(\"/missing\"), error = identity)\n\n  expect_s3_class(e, \"github_error\")\n  expect_s3_class(e, \"http_error_404\")\n})\n\ntest_that(\"can catch a given status directly\", {\n  skip_if_no_github()\n\n  e <- tryCatch(gh(\"/missing\"), \"http_error_404\" = identity)\n\n  expect_s3_class(e, \"github_error\")\n  expect_s3_class(e, \"http_error_404\")\n})\n\ntest_that(\"can ignore trailing commas\", {\n  skip_on_cran()\n  expect_no_error(gh(\"/orgs/tidyverse/repos\", ))\n})\n\ntest_that(\"can use per_page or .per_page but not both\", {\n  skip_on_cran()\n  resp <- gh(\"/orgs/tidyverse/repos\", per_page = 2)\n  expect_equal(attr(resp, \"request\")$query$per_page, 2)\n\n  resp <- gh(\"/orgs/tidyverse/repos\", .per_page = 2)\n  expect_equal(attr(resp, \"request\")$query$per_page, 2)\n\n  expect_snapshot(\n    error = TRUE,\n    gh(\"/orgs/tidyverse/repos\", per_page = 1, .per_page = 2)\n  )\n})\n\ntest_that(\"can paginate\", {\n  skip_on_cran()\n  pages <- gh(\n    \"/orgs/tidyverse/repos\",\n    per_page = 1,\n    .limit = 5,\n    .progress = FALSE\n  )\n  expect_length(pages, 5)\n})\n\ntest_that(\"trim output when .limit isn't a multiple of .per_page\", {\n  skip_on_cran()\n  pages <- gh(\n    \"/orgs/tidyverse/repos\",\n    per_page = 2,\n    .limit = 3,\n    .progress = FALSE\n  )\n  expect_length(pages, 3)\n})\n\ntest_that(\"can paginate repository search\", {\n  skip_on_cran()\n  # we need to run this sparingly, otherwise we'll get rate\n  # limited and the test fails\n  skip_on_ci()\n  pages <- gh(\n    \"/search/repositories\",\n    q = \"tidyverse\",\n    per_page = 10,\n    .limit = 35\n  )\n  expect_named(pages, c(\"total_count\", \"incomplete_results\", \"items\"))\n  # Eliminates aren't trimmed to .limit in this case\n  
expect_length(pages$items, 40)\n})\n"
  },
  {
    "path": "tests/testthat/test-gh_rate_limit.R",
    "content": "test_that(\"good input\", {\n  mock_res <- structure(\n    list(),\n    class = \"gh_response\",\n    response = list(\n      \"x-ratelimit-limit\" = \"5000\",\n      \"x-ratelimit-remaining\" = \"4999\",\n      \"x-ratelimit-reset\" = \"1580507619\"\n    )\n  )\n\n  limit <- gh_rate_limit(mock_res)\n\n  expect_equal(limit$limit, 5000L)\n  expect_equal(limit$remaining, 4999L)\n  expect_s3_class(limit$reset, \"POSIXct\") # Avoiding tz issues\n})\n\ntest_that(\"errors\", {\n  expect_snapshot(error = TRUE, {\n    gh_rate_limit(list())\n    gh_rate_limits(.token = \"bad\")\n  })\n})\n\ntest_that(\"missing rate limit\", {\n  mock_res <- structure(\n    list(),\n    class = \"gh_response\",\n    response = list()\n  )\n\n  limit <- gh_rate_limit(mock_res)\n\n  expect_equal(limit$limit, NA_integer_)\n  expect_equal(limit$remaining, NA_integer_)\n  expect_equal(as.double(limit$reset), NA_real_)\n})\n"
  },
  {
    "path": "tests/testthat/test-gh_request.R",
    "content": "test_that(\"all forms of specifying endpoint are equivalent\", {\n  r1 <- gh_build_request(\"GET /rate_limit\")\n  expect_equal(r1$method, \"GET\")\n  expect_equal(r1$url, \"https://api.github.com/rate_limit\")\n\n  expect_equal(gh_build_request(\"/rate_limit\"), r1)\n  expect_equal(gh_build_request(\"GET https://api.github.com/rate_limit\"), r1)\n  expect_equal(gh_build_request(\"https://api.github.com/rate_limit\"), r1)\n})\n\ntest_that(\"method arg sets default method\", {\n  r <- gh_build_request(\"/rate_limit\", method = \"POST\")\n  expect_equal(r$method, \"POST\")\n})\n\ntest_that(\"parameter substitution is equivalent to direct specification (:)\", {\n  subst <-\n    gh_build_request(\n      \"POST /repos/:org/:repo/issues/:number/labels\",\n      params = list(\n        org = \"ORG\",\n        repo = \"REPO\",\n        number = \"1\",\n        \"body\"\n      )\n    )\n  spec <-\n    gh_build_request(\n      \"POST /repos/ORG/REPO/issues/1/labels\",\n      params = list(\"body\")\n    )\n  expect_identical(subst, spec)\n})\n\ntest_that(\"parameter substitution is equivalent to direct specification\", {\n  subst <-\n    gh_build_request(\n      \"POST /repos/{org}/{repo}/issues/{number}/labels\",\n      params = list(\n        org = \"ORG\",\n        repo = \"REPO\",\n        number = \"1\",\n        \"body\"\n      )\n    )\n  spec <-\n    gh_build_request(\n      \"POST /repos/ORG/REPO/issues/1/labels\",\n      params = list(\"body\")\n    )\n  expect_identical(subst, spec)\n})\n\ntest_that(\"URI templates that need expansion are detected\", {\n  expect_true(is_uri_template(\"/orgs/{org}/repos\"))\n  expect_true(is_uri_template(\"/repos/{owner}/{repo}\"))\n  expect_false(is_uri_template(\"/user/repos\"))\n})\n\ntest_that(\"older 'colon templates' are detected\", {\n  expect_true(is_colon_template(\"/orgs/:org/repos\"))\n  expect_true(is_colon_template(\"/repos/:owner/:repo\"))\n  
expect_false(is_colon_template(\"/user/repos\"))\n})\n\ntest_that(\"gh_set_endpoint() works\", {\n  # no expansion, no extra params\n  input <- list(endpoint = \"/user/repos\")\n  expect_equal(input, gh_set_endpoint(input))\n\n  # no expansion, with extra params\n  input <- list(endpoint = \"/user/repos\", params = list(page = 2))\n  expect_equal(input, gh_set_endpoint(input))\n\n  # expansion, no extra params\n  input <- list(\n    endpoint = \"/repos/{owner}/{repo}\",\n    params = list(owner = \"OWNER\", repo = \"REPO\")\n  )\n  out <- gh_set_endpoint(input)\n  expect_equal(\n    out,\n    list(endpoint = \"/repos/OWNER/REPO\", params = list())\n  )\n\n  # expansion, with extra params\n  input <- list(\n    endpoint = \"/repos/{owner}/{repo}/issues\",\n    params = list(state = \"open\", owner = \"OWNER\", repo = \"REPO\", page = 2)\n  )\n  out <- gh_set_endpoint(input)\n  expect_equal(out$endpoint, \"/repos/OWNER/REPO/issues\")\n  expect_equal(out$params, list(state = \"open\", page = 2))\n})\n\ntest_that(\"gh_set_endpoint() refuses to substitute an NA\", {\n  input <- list(\n    endpoint = \"POST /orgs/{org}/repos\",\n    params = list(org = NA)\n  )\n  expect_snapshot(error = TRUE, gh_set_endpoint(input))\n})\n\ntest_that(\"gh_set_endpoint() allows a named NA in body for non-GET\", {\n  input <- list(\n    endpoint = \"PUT /repos/{owner}/{repo}/pages\",\n    params = list(owner = \"OWNER\", repo = \"REPO\", cname = NA)\n  )\n  out <- gh_set_endpoint(input)\n  expect_equal(out$endpoint, \"PUT /repos/OWNER/REPO/pages\")\n  expect_equal(out$params, list(cname = NA))\n})\n\ntest_that(\"gh_set_url() ensures URL is in 'API form'\", {\n  input <- list(\n    endpoint = \"/user/repos\",\n    api_url = \"https://github.com\"\n  )\n  out <- gh_set_url(input)\n  expect_equal(out$api_url, \"https://api.github.com\")\n\n  input$api_url <- \"https://github.acme.com\"\n  out <- gh_set_url(input)\n  expect_equal(out$api_url, 
\"https://github.acme.com/api/v3\")\n})\n\ntest_that(\"gh_make_request() errors if unknown verb\", {\n  expect_snapshot_error(gh(\"geeet /users/hadley/repos\", .limit = 2))\n})\n"
  },
  {
    "path": "tests/testthat/test-gh_response.R",
    "content": "test_that(\"works with empty bodies\", {\n  skip_if_no_github()\n\n  out <- gh(\"GET /orgs/{org}/repos\", org = \"gh-org-testing-no-repos\")\n  expect_equal(out, list(), ignore_attr = TRUE)\n\n  out <- gh(\"POST /markdown\", text = \"\")\n  expect_equal(out, list(), ignore_attr = TRUE)\n})\n\ntest_that(\"works with empty bodies from DELETE\", {\n  skip_if_no_github(has_scope = \"gist\")\n\n  out <- gh(\n    \"POST /gists\",\n    files = list(x = list(content = \"y\")),\n    public = FALSE\n  )\n  out <- gh(\"DELETE /gists/{gist_id}\", gist_id = out$id)\n  expect_equal(out, list(), ignore_attr = TRUE)\n})\n\ntest_that(\"can get raw response\", {\n  skip_if_no_github()\n\n  res <- gh(\n    \"GET /repos/{owner}/{repo}/contents/{path}\",\n    owner = \"r-lib\",\n    repo = \"gh\",\n    path = \"DESCRIPTION\",\n    .send_headers = c(Accept = \"application/vnd.github.v3.raw\")\n  )\n\n  expect_equal(\n    attr(res, \"response\")[[\"x-github-media-type\"]],\n    \"github.v3; param=raw\"\n  )\n  expect_equal(class(res), c(\"gh_response\", \"raw\"))\n})\n\ntest_that(\"can download files\", {\n  skip_if_no_github()\n\n  tmp <- withr::local_tempfile()\n  res_file <- gh(\n    \"/orgs/{org}/repos\",\n    org = \"r-lib\",\n    type = \"sources\",\n    .destfile = tmp\n  )\n  expect_equal(class(res_file), c(\"gh_response\", \"path\"))\n  expect_equal(res_file, tmp, ignore_attr = TRUE)\n})\n\ntest_that(\"warns if output is HTML\", {\n  skip_on_cran()\n  expect_snapshot(res <- gh(\"POST /markdown\", text = \"foo\"))\n\n  expect_equal(res, list(message = \"<p>foo</p>\\n\"), ignore_attr = TRUE)\n  expect_equal(class(res), c(\"gh_response\", \"list\"))\n})\n\ntest_that(\"captures details to recreate request\", {\n  skip_on_cran()\n  res <- gh(\"/orgs/{org}/repos\", org = \"r-lib\", .per_page = 1)\n\n  req <- attr(res, \"request\")\n  expect_type(req, \"list\")\n  expect_equal(req$url, \"https://api.github.com/orgs/r-lib/repos\")\n  expect_equal(req$query, list(per_page 
= 1))\n})\n\ntest_that(\"output file is not overwritten on error\", {\n  tmp <- withr::local_tempfile()\n  writeLines(\"foo\", tmp)\n\n  err <- tryCatch(\n    gh(\"/repos\", .destfile = tmp),\n    error = function(e) e\n  )\n\n  expect_true(file.exists(tmp))\n  expect_equal(readLines(tmp), \"foo\")\n  expect_true(!is.null((err$response_content)))\n})\n\n\ntest_that(\"gh_response objects can be combined via vctrs #161\", {\n  skip_on_cran()\n  skip_if_not_installed(\"vctrs\")\n  user_1 <- gh(\"/users\", .limit = 1)\n  user_2 <- gh(\"/users\", .limit = 1, )\n  user_vec <- vctrs::vec_c(user_1, user_2)\n  user_df <- vctrs::vec_rbind(user_1[[1]], user_2[[1]])\n  expect_equal(length(user_vec), 2)\n  expect_equal(nrow(user_df), 2)\n})\n"
  },
  {
    "path": "tests/testthat/test-gh_token.R",
    "content": "test_that(\"URL specific token is used\", {\n  good <- gh_pat(strrep(\"a\", 40))\n  good2 <- gh_pat(strrep(\"b\", 40))\n  bad <- gh_pat(strrep(\"0\", 40))\n  bad2 <- gh_pat(strrep(\"1\", 40))\n\n  env <- c(\n    GITHUB_API_URL = \"https://github.acme.com\",\n    GITHUB_PAT_GITHUB_ACME_COM = good,\n    GITHUB_PAT_GITHUB_ACME2_COM = good2,\n    GITHUB_PAT = bad,\n    GITHUB_TOKEN = bad2\n  )\n  withr::with_envvar(env, {\n    expect_equal(gh_token(), good)\n    expect_equal(gh_token(\"https://github.acme2.com\"), good2)\n  })\n\n  env <- c(\n    GITHUB_API_URL = NA,\n    GITHUB_PAT_GITHUB_COM = good,\n    GITHUB_PAT = bad,\n    GITHUB_TOKEN = bad2\n  )\n  withr::with_envvar(env, {\n    expect_equal(gh_token(), good)\n    expect_equal(gh_token(\"https://api.github.com\"), good)\n  })\n})\n\ntest_that(\"fall back to GITHUB_PAT, then GITHUB_TOKEN\", {\n  pat <- gh_pat(strrep(\"a\", 40))\n  token <- gh_pat(strrep(\"0\", 40))\n\n  env <- c(\n    GITHUB_API_URL = NA,\n    GITHUB_PAT_GITHUB_COM = NA,\n    GITHUB_PAT = pat,\n    GITHUB_TOKEN = token\n  )\n  withr::with_envvar(env, {\n    expect_equal(gh_token(), pat)\n    expect_equal(gh_token(\"https://api.github.com\"), pat)\n  })\n\n  env <- c(\n    GITHUB_API_URL = NA,\n    GITHUB_PAT_GITHUB_COM = NA,\n    GITHUB_PAT = NA,\n    GITHUB_TOKEN = token\n  )\n  withr::with_envvar(env, {\n    expect_equal(gh_token(), token)\n    expect_equal(gh_token(\"https://api.github.com\"), token)\n  })\n})\n\ntest_that(\"gh_token_exists works as expected\", {\n  withr::local_envvar(GITHUB_API_URL = \"https://test.com\")\n\n  withr::local_envvar(GITHUB_PAT_TEST_COM = NA)\n  expect_false(gh_token_exists())\n\n  withr::local_envvar(GITHUB_PAT_TEST_COM = gh_pat(strrep(\"0\", 40)))\n  expect_true(gh_token_exists())\n\n  withr::local_envvar(GITHUB_PAT_TEST_COM = \"invalid\")\n  expect_false(gh_token_exists())\n})\n\n# gh_pat class ----\ntest_that(\"validate_gh_pat() rejects bad characters, wrong # of characters\", {\n  # older 
PATs\n  expect_error(gh_pat(strrep(\"a\", 40)), NA)\n  expect_error(\n    gh_pat(strrep(\"g\", 40)),\n    \"40 hexadecimal digits\",\n    class = \"error\"\n  )\n  expect_error(gh_pat(\"aa\"), \"40 hexadecimal digits\", class = \"error\")\n\n  # newer PATs\n  expect_error(gh_pat(paste0(\"ghp_\", strrep(\"B\", 36))), NA)\n  expect_error(gh_pat(paste0(\"ghp_\", strrep(\"3\", 251))), NA)\n  expect_error(gh_pat(paste0(\"github_pat_\", strrep(\"A\", 36))), NA)\n  expect_error(gh_pat(paste0(\"github_pat_\", strrep(\"3\", 244))), NA)\n  expect_error(\n    gh_pat(paste0(\"ghJ_\", strrep(\"a\", 36))),\n    \"prefix\",\n    class = \"error\"\n  )\n  expect_error(\n    gh_pat(paste0(\"github_pa_\", strrep(\"B\", 244))),\n    \"github_pat_\",\n    class = \"error\"\n  )\n})\n\ntest_that(\"format.gh_pat() and str.gh_pat() hide the middle stuff\", {\n  pat <- paste0(strrep(\"a\", 10), strrep(\"4\", 20), strrep(\"F\", 10))\n  expect_match(format(gh_pat(pat)), \"[a-zA-Z]+\")\n  expect_output(str(gh_pat(pat)), \"[a-zA-Z]+\")\n})\n\ntest_that(\"str.gh_pat() indicates it's a `gh_pat`\", {\n  pat <- paste0(strrep(\"a\", 10), strrep(\"4\", 20), strrep(\"F\", 10))\n  expect_output(str(gh_pat(pat)), \"gh_pat\")\n})\n\ntest_that(\"format.gh_pat() handles empty string\", {\n  expect_match(format(gh_pat(\"\")), \"<no PAT>\")\n})\n\n# URL processing helpers ----\ntest_that(\"get_baseurl() insists on http(s)\", {\n  expect_snapshot(error = TRUE, {\n    get_baseurl(\"github.com\")\n    get_baseurl(\"github.acme.com\")\n  })\n})\n\ntest_that(\"get_baseurl() works\", {\n  x <- \"https://github.com\"\n  expect_equal(get_baseurl(\"https://github.com\"), x)\n  expect_equal(get_baseurl(\"https://github.com/\"), x)\n  expect_equal(get_baseurl(\"https://github.com/stuff\"), x)\n  expect_equal(get_baseurl(\"https://github.com/stuff/\"), x)\n  expect_equal(get_baseurl(\"https://github.com/more/stuff\"), x)\n\n  x <- \"https://api.github.com\"\n  expect_equal(get_baseurl(\"https://api.github.com\"), x)\n 
 expect_equal(get_baseurl(\"https://api.github.com/rate_limit\"), x)\n\n  x <- \"https://github.acme.com\"\n  expect_equal(get_baseurl(\"https://github.acme.com\"), x)\n  expect_equal(get_baseurl(\"https://github.acme.com/\"), x)\n  expect_equal(get_baseurl(\"https://github.acme.com/api/v3\"), x)\n\n  # so (what little) support we have for user@host doesn't regress\n  expect_equal(\n    get_baseurl(\"https://jane@github.acme.com/api/v3\"),\n    \"https://jane@github.acme.com\"\n  )\n})\n\ntest_that(\"is_github_dot_com() works\", {\n  expect_true(is_github_dot_com(\"https://github.com\"))\n  expect_true(is_github_dot_com(\"https://api.github.com\"))\n  expect_true(is_github_dot_com(\"https://api.github.com/rate_limit\"))\n  expect_true(is_github_dot_com(\"https://api.github.com/graphql\"))\n\n  expect_false(is_github_dot_com(\"https://github.acme.com\"))\n  expect_false(is_github_dot_com(\"https://github.acme.com/api/v3\"))\n  expect_false(is_github_dot_com(\"https://github.acme.com/api/v3/user\"))\n})\n\ntest_that(\"get_hosturl() works\", {\n  x <- \"https://github.com\"\n  expect_equal(get_hosturl(\"https://github.com\"), x)\n  expect_equal(get_hosturl(\"https://api.github.com\"), x)\n\n  x <- \"https://github.acme.com\"\n  expect_equal(get_hosturl(\"https://github.acme.com\"), x)\n  expect_equal(get_hosturl(\"https://github.acme.com/api/v3\"), x)\n})\n\ntest_that(\"get_apiurl() works\", {\n  x <- \"https://api.github.com\"\n  expect_equal(get_apiurl(\"https://github.com\"), x)\n  expect_equal(get_apiurl(\"https://github.com/\"), x)\n  expect_equal(get_apiurl(\"https://github.com/r-lib/gh/issues\"), x)\n  expect_equal(get_apiurl(\"https://api.github.com\"), x)\n  expect_equal(get_apiurl(\"https://api.github.com/rate_limit\"), x)\n\n  x <- \"https://github.acme.com/api/v3\"\n  expect_equal(get_apiurl(\"https://github.acme.com\"), x)\n  expect_equal(get_apiurl(\"https://github.acme.com/OWNER/REPO\"), x)\n  expect_equal(get_apiurl(\"https://github.acme.com/api/v3\"), 
x)\n})\n\ntest_that(\"tokens can be requested from a Connect server\", {\n  skip_if_not_installed(\"connectcreds\")\n\n  token <- strrep(\"a\", 40)\n  connectcreds::local_mocked_connect_responses(token = token)\n  expect_equal(gh_token(), gh_pat(token))\n})\n"
  },
  {
    "path": "tests/testthat/test-gh_whoami.R",
    "content": "test_that(\"whoami works in presence of PAT\", {\n  skip_if_no_github(has_scope = \"user\")\n\n  res <- gh_whoami()\n  expect_s3_class(res, \"gh_response\")\n  expect_match(res[[\"scopes\"]], \"\\\\buser\\\\b\")\n})\n\ntest_that(\"whoami errors with bad/absent PAT\", {\n  skip_if_no_github()\n  skip_on_ci() # since no token sometimes fails due to rate-limiting\n  withr::local_envvar(GH_FORCE_HTTP_1_1 = \"true\")\n\n  expect_snapshot(error = TRUE, {\n    gh_whoami(.token = \"\")\n    gh_whoami(.token = NA)\n    gh_whoami(.token = \"blah\")\n  })\n})\n"
  },
  {
    "path": "tests/testthat/test-git.R",
    "content": "test_that(\"picks origin if available\", {\n  remotes <- list(\n    upstream = \"https://github.com/x/1\",\n    origin = \"https://github.com/x/2\"\n  )\n\n  expect_warning(gr <- github_remote(remotes, \".\"), \"Using origin\")\n  expect_equal(gr$repo, \"2\")\n})\n\ntest_that(\"otherwise picks first\", {\n  remotes <- list(\n    a = \"https://github.com/x/1\",\n    b = \"https://github.com/x/2\"\n  )\n\n  expect_warning(gr <- github_remote(remotes, \".\"), \"Using first\")\n  expect_equal(gr$repo, \"1\")\n})\n\n\n# Parsing -----------------------------------------------------------------\n\ntest_that(\"parses common url forms\", {\n  expected <- list(username = \"x\", repo = \"y\")\n\n  expect_equal(github_remote_parse(\"https://github.com/x/y.git\"), expected)\n  expect_equal(github_remote_parse(\"https://github.com/x/y\"), expected)\n  expect_equal(github_remote_parse(\"git@github.com:x/y.git\"), expected)\n})\n\ntest_that(\"returns NULL if can't parse\", {\n  expect_equal(github_remote_parse(\"blah\"), NULL)\n})\n"
  },
  {
    "path": "tests/testthat/test-mock-repos.R",
    "content": "if (!exists(\"TMPL\", environment(), inherits = FALSE)) {\n  TMPL <- function(x) x\n}\n\ntest_that(\"repos, some basics\", {\n  skip_if_no_github()\n\n  res <- gh(\n    TMPL(\"/users/{username}/repos\"),\n    username = \"gaborcsardi\"\n  )\n  expect_true(all(c(\"id\", \"name\", \"full_name\") %in% names(res[[1]])))\n\n  res <- gh(\n    TMPL(\"/orgs/{org}/repos\"),\n    org = \"r-lib\",\n    type = \"sources\",\n    sort = \"full_name\"\n  )\n  expect_true(\"actions\" %in% vapply(res, \"[[\", \"name\", FUN.VALUE = \"\"))\n\n  res <- gh(\"/repositories\")\n  expect_true(all(c(\"id\", \"name\", \"full_name\") %in% names(res[[1]])))\n})\n\ntest_that(\"can POST, PATCH, and DELETE\", {\n  skip_if_no_github(has_scope = \"gist\")\n\n  res <- gh(\n    \"POST /gists\",\n    files = list(test.R = list(content = \"test\")),\n    description = \"A test gist for gh\",\n    public = FALSE\n  )\n  expect_equal(res$description, \"A test gist for gh\")\n  expect_false(res$public)\n\n  res <- gh(\n    TMPL(\"PATCH /gists/{gist_id}\"),\n    gist_id = res$id,\n    description = \"Still a test repo\"\n  )\n  expect_equal(res$description, \"Still a test repo\")\n\n  res <- gh(\n    TMPL(\"DELETE /gists/{gist_id}\"),\n    gist_id = res$id\n  )\n  expect_s3_class(res, c(\"gh_response\", \"list\"))\n})\n"
  },
  {
    "path": "tests/testthat/test-old-templates.R",
    "content": "TMPL <- function(x) {\n  gsub(\"[{]([^}]+)[}]\", \":\\\\1\", x)\n}\n\nsource(\"test-mock-repos.R\", local = TRUE)\n"
  },
  {
    "path": "tests/testthat/test-pagination.R",
    "content": "test_that(\"can extract relative pages\", {\n  skip_on_cran()\n  page1 <- gh(\"/orgs/tidyverse/repos\", per_page = 1)\n  expect_true(gh_has(page1, \"next\"))\n  expect_false(gh_has(page1, \"prev\"))\n\n  page2 <- gh_next(page1)\n  expect_equal(\n    attr(page2, \"request\")$url,\n    \"https://api.github.com/organizations/22032646/repos?per_page=1&page=2\"\n  )\n  expect_true(gh_has(page2, \"prev\"))\n\n  expect_snapshot(gh_prev(page1), error = TRUE)\n})\n\ntest_that(\"can paginate even when space re-encoded to +\", {\n  skip_on_cran()\n  json <- gh::gh(\n    \"GET /search/issues\",\n    q = 'label:\"tidy-dev-day :nerd_face:\"',\n    per_page = 10,\n    .limit = 20\n  )\n  expect_length(json$items, 20)\n})\n\ntest_that(\"paginated request gets max_wait and max_rate\", {\n  skip_on_cran()\n  gh <- gh(\"/orgs/tidyverse/repos\", per_page = 5, .max_wait = 1, .max_rate = 10)\n\n  req <- gh_link_request(gh, \"next\", .token = NULL, .send_headers = NULL)\n  expect_equal(req$max_wait, 1)\n  expect_equal(req$max_rate, 10)\n\n  url <- httr2::url_parse(req$url)\n  expect_equal(url$query$page, \"2\")\n})\n"
  },
  {
    "path": "tests/testthat/test-print.R",
    "content": "test_that(\"can print all types of object\", {\n  skip_on_cran()\n  local_options(gh_cache = FALSE)\n\n  get_license <- function(...) {\n    gh(\n      \"GET /repos/{owner}/{repo}/contents/{path}\",\n      owner = \"r-lib\",\n      repo = \"gh\",\n      path = \"LICENSE\",\n      ref = \"v1.2.0\",\n      ...\n    )\n  }\n\n  json <- get_license()\n  raw <- get_license(\n    .send_headers = c(Accept = \"application/vnd.github.v3.raw\")\n  )\n\n  path <- withr::local_file(test_path(\"LICENSE\"))\n  file <- get_license(\n    .destfile = path,\n    .send_headers = c(Accept = \"application/vnd.github.v3.raw\")\n  )\n\n  expect_snapshot({\n    json\n    file\n    raw\n  })\n})\n"
  },
  {
    "path": "tests/testthat/test-spelling.R",
    "content": "test_that(\"spelling\", {\n  skip_on_cran()\n  skip_on_covr()\n  pkgroot <- test_package_root()\n  err <- spelling::spell_check_package(pkgroot)\n  num_spelling_errors <- nrow(err)\n  expect_true(\n    num_spelling_errors == 0,\n    info = paste(\n      c(\"\\nSpelling errors:\", capture.output(err)),\n      collapse = \"\\n\"\n    )\n  )\n})\n"
  },
  {
    "path": "tests/testthat/test-utils.R",
    "content": "test_that(\"can detect presence vs absence names\", {\n  expect_identical(has_name(list(\"foo\", \"bar\")), c(FALSE, FALSE))\n  expect_identical(has_name(list(a = \"foo\", \"bar\")), c(TRUE, FALSE))\n\n  expect_identical(\n    has_name({\n      x <- list(\"foo\", \"bar\")\n      names(x)[1] <- \"a\"\n      x\n    }),\n    c(TRUE, FALSE)\n  )\n  expect_identical(\n    has_name({\n      x <- list(\"foo\", \"bar\")\n      names(x)[1] <- \"a\"\n      names(x)[2] <- \"\"\n      x\n    }),\n    c(TRUE, FALSE)\n  )\n\n  expect_identical(\n    has_name({\n      x <- list(\"foo\", \"bar\")\n      names(x)[1] <- \"\"\n      x\n    }),\n    c(FALSE, FALSE)\n  )\n  expect_identical(\n    has_name({\n      x <- list(\"foo\", \"bar\")\n      names(x)[1] <- \"\"\n      names(x)[2] <- \"\"\n      x\n    }),\n    c(FALSE, FALSE)\n  )\n})\n\ntest_that(\"named NULL is dropped\", {\n  tcs <- list(\n    list(list(), list()),\n    list(list(a = 1), list(a = 1)),\n    list(list(NULL), list(NULL)),\n    list(list(a = NULL), list()),\n    list(list(NULL, a = NULL, 1), list(NULL, 1)),\n    list(list(a = NULL, b = 1, 5), list(b = 1, 5))\n  )\n\n  for (tc in tcs) {\n    expect_identical(\n      drop_named_nulls(tc[[1]]),\n      tc[[2]],\n      info = tc\n    )\n  }\n})\n\ntest_that(\"named NA is error\", {\n  goodtcs <- list(\n    list(),\n    list(NA),\n    list(NA, NA_integer_, a = 1)\n  )\n\n  badtcs <- list(\n    list(b = NULL, a = NA),\n    list(a = NA_integer_),\n    list(NA, c = NA_real_)\n  )\n\n  for (tc in goodtcs) {\n    expect_silent(check_named_nas(tc))\n  }\n\n  for (tc in badtcs) {\n    expect_snapshot(error = TRUE, check_named_nas(tc))\n  }\n})\n\n\ntest_that(\".parse_params combines list .params with ... 
params\", {\n  params <- list(\n    .parse_params(org = \"ORG\", repo = \"REPO\", number = \"1\"),\n    .parse_params(org = \"ORG\", repo = \"REPO\", .params = list(number = \"1\")),\n    .parse_params(.params = list(org = \"ORG\", repo = \"REPO\", number = \"1\"))\n  )\n\n  expect_identical(params[[1]], params[[2]])\n  expect_identical(params[[2]], params[[3]])\n})\n"
  },
  {
    "path": "tests/testthat.R",
    "content": "library(testthat)\nlibrary(gh)\n\nif (Sys.getenv(\"NOT_CRAN\") == \"true\") {\n  test_check(\"gh\")\n}\n"
  },
  {
    "path": "vignettes/.gitignore",
    "content": "*.html\n*.R\n"
  },
  {
    "path": "vignettes/managing-personal-access-tokens.Rmd",
    "content": "---\ntitle: \"Managing Personal Access Tokens\"\noutput: rmarkdown::html_vignette\nvignette: >\n  %\\VignetteIndexEntry{Managing Personal Access Tokens}\n  %\\VignetteEngine{knitr::rmarkdown}\n  %\\VignetteEncoding{UTF-8}\n---\n\n```{r}\n#| include: false\nknitr::opts_chunk$set(\n  collapse = TRUE,\n  comment = \"#>\"\n)\n```\n\n```{r}\n#| label: setup\nlibrary(gh)\n```\n\n<!-- This vignette uses a convention of \"one sentence per line\" in prose. -->\n\ngh generally sends a Personal Access Token (PAT) with its requests.\nSome endpoints of the GitHub API can be accessed without authenticating yourself.\nBut once your API use becomes more frequent, you will want a PAT to prevent problems with rate limits and to access all possible endpoints.\n\nThis article describes how to store your PAT, so that gh can find it (automatically, in most cases). The function gh uses for this is `gh_token()`.\n\nMore resources on PAT management:\n\n  * GitHub documentation on [Creating a personal access token](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token)\n    - Important: a PAT can expire, the default expiration date is 30 days. 
\n  * In the [usethis package](https://usethis.r-lib.org):\n    - Vignette: [Managing Git(Hub) Credentials](https://usethis.r-lib.org/articles/articles/git-credentials.html) \n    - `usethis::gh_token_help()` and `usethis::git_sitrep()` help you check if\n      a PAT is discoverable and has suitable scopes\n    - `usethis::create_github_token()` guides you through the process of getting\n      a new PAT\n  * In the [gitcreds package](https://gitcreds.r-lib.org/):\n    - `gitcreds::gitcreds_set()` helps you explicitly put your PAT into the Git\n      credential store\n  \n## PAT and host\n\n`gh::gh()` allows the user to provide a PAT via the `.token` argument and to specify a host other than \"github.com\" via the `.api_url` argument.\n(Some companies and universities run their own instance of GitHub Enterprise.)\n\n```{r}\n#| eval: false\ngh(endpoint, ..., .token = NULL, ..., .api_url = NULL, ...)\n```\n\nHowever, it's annoying to always provide your PAT or host and it's unsafe for your PAT to appear explicitly in your R code.\nIt's important to make it *possible* for the user to provide the PAT and/or API URL directly, but it should rarely be necessary.\n`gh::gh()` is designed to play well with more secure, less fiddly methods for expressing what you want.\n\nHow are `.api_url` and `.token` determined when the user does not provide them?\n\n  1. `.api_url` defaults to the value of the `GITHUB_API_URL` environment\n    variable and, if that is unset, falls back to `\"https://api.github.com\"`.\n    This is always done before worrying about the PAT.\n  1. The PAT is obtained via a call to `gh_token(.api_url)`. 
That is, the token\n    is looked up based on the host.\n\n## The gitcreds package\n\ngh now uses the gitcreds package to interact with the Git credential store.\n\ngh calls `gitcreds::gitcreds_get()` with a URL to try to find a matching PAT.\n`gitcreds::gitcreds_get()` checks session environment variables and then the local Git credential store.\nTherefore, if you have previously used a PAT with, e.g., command line Git, gh may retrieve and re-use it.\nYou can call `gitcreds::gitcreds_get()` directly, yourself, if you want to see what is found for a specific URL.\n\n``` r\ngitcreds::gitcreds_get()\n```\n\nIf you see something like this:\n``` r\n#> <gitcreds>\n#>   protocol: https\n#>   host    : github.com\n#>   username: PersonalAccessToken\n#>   password: <-- hidden -->\n```\nthat means that gitcreds could get the PAT from the Git credential store.\nYou can call `gitcreds_get()$password` to see the actual PAT.\n\nIf no matching PAT is found, `gitcreds::gitcreds_get()` errors.\n\n## PAT in an environment variable\n\nIf you don't have a Git installation, or your Git installation does not have a working credential store, then you can specify the PAT in an environment variable.\nFor `github.com` you can set the `GITHUB_PAT_GITHUB_COM` or `GITHUB_PAT` variable.\nFor a different GitHub host, call `gitcreds::gitcreds_cache_envvar()` with the API URL to see the environment variable you need to set.\nFor example:\n\n```{r}\ngitcreds::gitcreds_cache_envvar(\"https://github.acme.com\")\n```\n\n## Recommendations\n\nOn a machine used for interactive development, we recommend:\n\n  * Store your PAT(s) in an official credential store.\n  * Do **not** store your PAT(s) in plain text in, e.g., `.Renviron`. 
In the\n    past, this has been a common and recommended practice for pragmatic reasons.\n    However, gitcreds/gh have now evolved to the point where it's\n    possible for all of us to follow better security practices.\n  * If you use a general-purpose password manager, like 1Password or LastPass,\n    you may *also* want to store your PAT(s) there. Why? If your PAT is\n    \"forgotten\" from the OS-level credential store, intentionally or not, you'll\n    need to provide it again when prompted.\n    \n    If you don't have any other record of your PAT, you'll have to get a new\n    PAT whenever this happens. This is not the end of the world. But if you\n    aren't disciplined about deleting lost PATs from\n    <https://github.com/settings/tokens>, you will eventually find yourself in a\n    confusing situation where you can't be sure which PAT(s) are in use.\n\nOn a headless system, such as on a CI/CD platform, provide the necessary PAT(s) via secure environment variables.\nRegular environment variables can be used to configure less sensitive settings, such as the API host.\nDon't expose your PAT by doing something silly like dumping all environment variables to a log file.\n\nNote that on GitHub Actions, specifically, a personal access token is [automatically available to the workflow](https://docs.github.com/en/actions/configuring-and-managing-workflows/authenticating-with-the-github_token) as the `GITHUB_TOKEN` secret.\nThat is why many workflows in the R community contain this snippet:\n\n``` yaml\nenv:\n  GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}\n```\n\nThis makes the automatic PAT available as the `GITHUB_PAT` environment variable.\nIf that PAT doesn't have the right permissions, then you'll need to explicitly provide one that does (see link above for more).\n\n## Failure\n\nIf there is no PAT to be had, `gh::gh()` sends a request with no token.\n(Internally, the `Authorization` header is omitted if the PAT is found to be the empty string, `\"\"`.)\n\nWhat 
do PAT-related failures look like?\n\nIf no PAT is sent and the endpoint requires no auth, the request probably succeeds!\nAt least until you run up against rate limits.\nIf the endpoint requires auth, you'll get an HTTP error, possibly this one:\n\n```\nGitHub API error (401): 401 Unauthorized\nMessage: Requires authentication\n```\n\nIf a PAT is first discovered in an environment variable, it is taken at face value.\nThe two most common ways to arrive here are PAT specification via `.Renviron` or as a secret in a CI/CD platform, such as GitHub Actions.\nIf the PAT is invalid, the first affected request will fail, probably like so:\n\n```\nGitHub API error (401): 401 Unauthorized\nMessage: Bad credentials\n```\n\nThis will also be the experience if an invalid PAT is provided directly via `.token`.\n\nEven a valid PAT can lead to a downstream error, if it has insufficient scopes with respect to a specific request.\n"
  }
]